Facial expressions in non-human animals are closely linked to their internal affective states, with the majority of empirical work focusing on facial shape changes associated with pain. However, existing tools for facial expression analysis are prone to human subjectivity and bias, and in many cases also require special expertise and training. This paper presents the first comparative study of two different paths towards automating pain recognition in facial images of domestic short-haired cats (n = 29), captured during ovariohysterectomy at different time points corresponding to varying intensities of pain. One approach is based on convolutional neural networks (ResNet50), while the other uses machine learning models based on geometric landmark analysis inspired by species-specific Facial Action Coding Systems (i.e. CatFACS). Both types of approaches reach comparable accuracies above 72%, indicating their potential usefulness as a basis for automating cat pain detection from images.
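To illustrate the landmark-based path mentioned above, the following is a minimal sketch of turning 2D facial landmark coordinates into a scale-normalized feature vector of pairwise distances, which a standard classifier (e.g. an SVM) could then consume. The landmark count, the distance features, and the normalization choice are illustrative assumptions, not the paper's actual pipeline.

```python
# Hedged sketch: geometric features from facial landmarks.
# Assumes landmarks are given as an (n_points, 2) array of (x, y) coordinates;
# the specific feature design here is an illustrative assumption.
import numpy as np

def landmark_features(landmarks: np.ndarray) -> np.ndarray:
    """Return normalized pairwise distances between all landmark pairs."""
    diffs = landmarks[:, None, :] - landmarks[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)     # full pairwise distance matrix
    iu = np.triu_indices(len(landmarks), k=1)  # keep each unordered pair once
    feats = dists[iu]
    return feats / feats.max()                 # scale-normalize to [0, 1]

# Example: 5 mock landmarks yield 5*4/2 = 10 distance features.
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [0.5, 2.0]])
feats = landmark_features(pts)
print(feats.shape)  # (10,)
```

Normalizing by the maximum inter-landmark distance makes the features invariant to image scale, which matters when face size varies across photographs.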
Bibliographical note

Funding Information:
The research was partially supported by the grant from the Ministry of Science and Technology of Israel according to the research project no. 19-57-06007 and by the Israel Ministry of Agriculture and Rural Development. The first author was additionally supported by the Data Science Research Center (DSRC), University of Haifa. The authors would like to thank Shir Amir for her scientific advice, Nareed Farhat and Ephantus Kanyugi for their help with data management, and Yaron Yossef for his technical support at all stages of this work.
© 2022, The Author(s).