Exploring Multi-Modality in Animal-Centered Computing

Ariel Oren

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Automated approaches to animal behavior and emotion recognition and analysis have great potential for expanding scientific knowledge about animals and their interactions with humans, as well as for the assessment of animal welfare and other applications promoting animal healthcare and well-being. While state-of-the-art AI-based approaches to human behavior and sentiment analysis are multi-modal, most research efforts related to animals have so far addressed one modality at a time: video, audio or sensor data. In this research I aim to fill this gap by exploring multi-modal approaches in the context of animal behavior.
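The abstract refers to combining video, audio and sensor modalities. As a purely illustrative sketch (not taken from the paper), one simple fusion strategy is to summarize each modality as a fixed-length feature vector, concatenate the vectors, and train a single classifier on the fused representation. The feature dimensions, labels and classifier below are hypothetical assumptions made for illustration only.

```python
# Illustrative feature-concatenation fusion sketch; all data here is synthetic
# and the dimensions/labels are assumptions, not the paper's actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

n_samples = 200
video_feats = rng.normal(size=(n_samples, 128))   # e.g. pose/appearance embeddings
audio_feats = rng.normal(size=(n_samples, 64))    # e.g. vocalization spectral features
sensor_feats = rng.normal(size=(n_samples, 16))   # e.g. accelerometer statistics
labels = rng.integers(0, 2, size=n_samples)       # hypothetical binary behavior labels

# Fuse modalities by stacking their feature vectors side by side.
fused = np.concatenate([video_feats, audio_feats, sensor_feats], axis=1)

# Train a single classifier on the fused representation.
clf = LogisticRegression(max_iter=1000).fit(fused, labels)
print("training accuracy:", clf.score(fused, labels))
```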

Original language: English
Title of host publication: ACI 2021 - 8th International Conference on Animal-Computer Interaction, Proceedings
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450385138
DOIs
State: Published - 8 Nov 2021
Event: 8th International Conference on Animal-Computer Interaction, ACI 2021 - Virtual, Online, United States
Duration: 8 Nov 2021 - 11 Nov 2021

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 8th International Conference on Animal-Computer Interaction, ACI 2021
Country/Territory: United States
City: Virtual, Online
Period: 8/11/21 - 11/11/21

Bibliographical note

Publisher Copyright:
© 2021 Owner/Author.

Keywords

  • Computer vision
  • Multi-modal data fusion
  • Signal processing

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Vision and Pattern Recognition
  • Computer Networks and Communications
