Boosting Inertial-Based Human Activity Recognition with Transformers

Yoli Shavit, Itzik Klein

Research output: Contribution to journal › Article › peer-review

Abstract

Activity recognition tasks, such as human activity recognition and smartphone location recognition, can improve the accuracy of navigation and healthcare applications that rely solely on inertial sensors. Current learning-based approaches to activity recognition from inertial data employ convolutional neural networks or long short-term memory (LSTM) architectures. Recently, Transformers were shown to outperform these architectures on sequence analysis tasks. This work presents a Transformer-based activity recognition model that offers an improved and general framework for learning activity recognition tasks. For evaluation, several datasets are employed, comprising more than 27 hours of inertial data recordings collected by 91 users and representing user activity scenarios of varying difficulty. The proposed approach consistently achieves higher accuracy and generalizes better across all examined datasets and scenarios. A codebase implementing the described framework is available at: http://github.com/yolish/har-with-imu-transformer.
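To make the kind of model described in the abstract concrete, the following is a minimal sketch, not the authors' exact architecture, of a Transformer-encoder classifier over windows of inertial data in PyTorch. The window length (128 samples), 6-channel accelerometer/gyroscope input, layer sizes, and class count are illustrative assumptions rather than the paper's settings.

# Minimal sketch of a Transformer-encoder classifier for inertial activity
# recognition. Hyperparameters (window length, channel count, model width,
# class count) are illustrative assumptions, not the paper's exact settings.
import torch
import torch.nn as nn

class IMUTransformerClassifier(nn.Module):
    def __init__(self, num_channels=6, window_size=128, d_model=64,
                 nhead=8, num_layers=4, num_classes=6):
        super().__init__()
        # Project each IMU sample (accelerometer + gyroscope channels) to d_model
        self.input_proj = nn.Linear(num_channels, d_model)
        # Learned positional embedding and a [CLS]-style token for classification
        self.pos_embed = nn.Parameter(torch.randn(1, window_size + 1, d_model))
        self.cls_token = nn.Parameter(torch.randn(1, 1, d_model))
        encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                                   dim_feedforward=128,
                                                   batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, num_classes)

    def forward(self, x):
        # x: (batch, window_size, num_channels) raw inertial readings
        tokens = self.input_proj(x)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        tokens = torch.cat([cls, tokens], dim=1) + self.pos_embed
        encoded = self.encoder(tokens)
        # Classify the window from the [CLS] token representation
        return self.head(encoded[:, 0])

# Example: a batch of 32 windows, each with 128 samples of 6 IMU channels
model = IMUTransformerClassifier()
logits = model(torch.randn(32, 128, 6))   # -> (32, num_classes)

Classifying from a dedicated token over self-attended IMU samples is one common way to pool a sequence for activity labels; the released codebase linked above should be consulted for the configuration actually evaluated in the paper.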

Original language: English
Article number: 9393889
Pages (from-to): 53540-53547
Number of pages: 8
Journal: IEEE Access
Volume: 9
DOIs
State: Published - 2021

Bibliographical note

Publisher Copyright:
© 2013 IEEE.

Keywords

  • Human activity recognition
  • Transformers
  • convolutional neural networks
  • inertial sensors
  • pedestrian dead reckoning
  • sequence analysis
  • smartphone location recognition

ASJC Scopus subject areas

  • General Computer Science
  • General Materials Science
  • General Engineering
