Latest News


Automated pain assessment based on facial expressions of free-moving mice — New AI-based technology that contributes to animal welfare developed by the University of Tokyo

2025.12.12

A research group led by Project Lecturer Koji Kobayashi and Associate Professor Takahisa Murata of the Graduate School of Agricultural and Life Sciences at the University of Tokyo has developed a new analytical method that uses AI to automatically assess pain from the "facial expressions" of mice. The advance not only greatly improves the reliability of research into pain mechanisms and drug discovery, but is also significant from an animal welfare perspective.

The Mouse Grimace Scale, reported in 2010, is widely used to score pain from facial grimacing. However, scoring requires observer expertise, varies between evaluators, and is not well suited to long-term observation.

The research team trained an AI model (a convolutional neural network, CNN) on approximately 540,000 facial images of BALB/c mice. By training on facial images of mice injected intraperitoneally with acetic acid (pain state) and of untreated mice (no-pain state), the AI was designed to autonomously extract the features that distinguish "pain" from "no pain." The CNN achieved high accuracy even on data not used for training, accurately predicting concentration-dependent changes in the pain stimulus. When the analgesic drug diclofenac was administered, the CNN automatically detected and quantified the reduction in pain.

Figure 1. Pain assessment from mouse facial images by the CNN. The CNN calculated the probability of pain from facial images cropped from video files.
Provided by the University of Tokyo
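
The pipeline described above amounts to a binary image classifier trained on cropped face images. As a rough illustration only, a minimal sketch of such a classifier is shown below in PyTorch; the architecture, image size, directory layout ("faces/{no_pain,pain}"), and hyperparameters are assumptions for illustration, not the published model.

```python
# Minimal sketch (PyTorch), assuming 96x96 grayscale face crops in a
# hypothetical faces/{no_pain,pain}/ folder layout; architecture and
# hyperparameters are illustrative, not the authors' published model.
import torch
import torch.nn as nn
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

class PainCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 12 * 12, 128), nn.ReLU(),
            nn.Linear(128, 2),  # logits for "no pain" / "pain"
        )

    def forward(self, x):
        return self.classifier(self.features(x))

tfm = transforms.Compose([
    transforms.Grayscale(),
    transforms.Resize((96, 96)),
    transforms.ToTensor(),
])
# Hypothetical directory of face images cropped from video frames.
train_ds = datasets.ImageFolder("faces", transform=tfm)
loader = DataLoader(train_ds, batch_size=64, shuffle=True)

model = PainCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for images, labels in loader:  # one training pass shown for brevity
    opt.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    opt.step()

# Pain probability for a batch of new face crops:
with torch.no_grad():
    probs = torch.softmax(model(images), dim=1)[:, 1]
```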

This demonstrated that the CNN can detect even drug effects from subtle facial changes. Furthermore, it accurately identified pain caused by stimuli other than acetic acid, such as capsaicin (a component of chili peppers) and calcitonin gene-related peptide (CGRP). In other words, the CNN learned "facial expressions" common to different types of pain.

The research team used a visualization technique called Grad-CAM to analyze which facial regions the CNN focused on during pain assessment. The results showed that the CNN focused on the ears, cheeks, and mouth in the "no pain" state, and on the forehead and head in the "pain" state. This finding indicates that the AI uses a broader range of facial-expression changes than traditional human observation, providing new scientific insight into how pain is "expressed."

Figure 2. Visualization of the CNN-focused facial regions.
Grad-CAM analysis revealed the areas the CNN focused on when calculating the pain probability.
Provided by the University of Tokyo
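
Grad-CAM itself is a general technique: it weights the final convolutional feature maps by the gradient of the target class score and projects the result back onto the input image as a heatmap. A minimal sketch is given below, building on the hypothetical PainCNN above; the layer choice and variable names are assumptions, not the authors' configuration.

```python
# Minimal Grad-CAM sketch (PyTorch), reusing the hypothetical PainCNN and the
# preprocessed face tensor `images` from the previous example.
import torch
import torch.nn.functional as F

def grad_cam(model, x, target_class=1):
    """Return heatmaps of the regions that drive the 'pain' logit."""
    acts, grads = [], []
    last_conv = model.features[-3]  # final Conv2d in the sketch above (assumption)
    h1 = last_conv.register_forward_hook(lambda m, i, o: acts.append(o))
    h2 = last_conv.register_full_backward_hook(lambda m, gi, go: grads.append(go[0]))

    logits = model(x)
    model.zero_grad()
    logits[:, target_class].sum().backward()
    h1.remove(); h2.remove()

    a, g = acts[0], grads[0]                     # (N, C, H, W)
    weights = g.mean(dim=(2, 3), keepdim=True)   # per-channel importance
    cam = F.relu((weights * a).sum(dim=1, keepdim=True))
    cam = F.interpolate(cam, size=x.shape[-2:], mode="bilinear",
                        align_corners=False)
    # Normalize each map to [0, 1] for overlay on the face crop.
    cam = cam / (cam.amax(dim=(2, 3), keepdim=True) + 1e-8)
    return cam.squeeze(1)                        # (N, H, W)

heatmaps = grad_cam(model, images[:4], target_class=1)
```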

The technology is expected to become a foundation for standardizing the evaluation of analgesic drugs in drug discovery and toxicity testing that rely on animal experiments, as well as for long-term analysis of emotions (discomfort, fear, pleasure, etc.).

Journal Information
Publication: PNAS Nexus
Title: Automated pain assessment based on facial expression of free-moving mice
DOI: 10.1093/pnasnexus/pgaf352

This article has been translated by JST with permission from The Science News Ltd. (https://sci-news.co.jp/). Unauthorized reproduction of the article and photographs is prohibited.
