DoorINet: Door Heading Prediction through Inertial Deep Learning

Aleksei Zakharchenko, Sharon Farber, Itzik Klein

Research output: Contribution to journal › Article › peer-review

Abstract

Inertial sensors are widely used in a variety of applications. A common task is orientation estimation, which is tackled with attitude and heading reference system (AHRS) algorithms. These algorithms propagate the orientation using gyroscope readings, update the attitude angles with accelerometer measurements, and update the heading angle with magnetometer measurements. In indoor environments, magnetometers suffer from interference that degrades their performance, resulting in poor heading angle estimation. Therefore, applications that estimate the heading angle of moving objects, such as walking pedestrians or the doors of closets and refrigerators, are prone to error. To circumvent such situations, we propose DoorINet, an end-to-end deep-learning framework to calculate the heading angle from door-mounted, low-cost inertial sensors without using magnetometers. To evaluate our approach, we record a unique dataset containing 391 minutes of accelerometer and gyroscope measurements and corresponding ground-truth heading angles. We show that our proposed approach outperforms commonly used model-based approaches and data-driven methods. To enable reproducibility of our results and future research, both code and data are available at https://github.com/ansfl/DoorINet.
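To illustrate why a magnetometer-free heading estimate is hard, the sketch below dead-reckons heading by integrating the z-axis gyroscope rate, the baseline that AHRS algorithms correct with magnetometer updates. This is a hypothetical, minimal illustration (not the DoorINet method or the paper's baselines): all names, the sampling rate, and the simulated door motion are assumptions chosen for the example.

```python
import numpy as np

def integrate_heading(gyro_z, dt, initial_heading=0.0):
    """Dead-reckon heading (rad) by integrating z-axis gyro rates (rad/s).

    Illustrative sketch only: without magnetometer (or learned)
    corrections, any constant gyroscope bias accumulates linearly
    into heading error.
    """
    return initial_heading + np.cumsum(gyro_z) * dt

# Hypothetical scenario: a door swings open 90 degrees in 2 s at 100 Hz.
dt = 0.01
t = np.arange(0.0, 2.0, dt)                     # 200 samples
true_rate = np.full_like(t, np.deg2rad(45.0))   # 45 deg/s for 2 s -> 90 deg
bias = np.deg2rad(0.5)                          # assumed constant gyro bias

heading_ideal = integrate_heading(true_rate, dt)
heading_biased = integrate_heading(true_rate + bias, dt)

# A 0.5 deg/s bias over 2 s contributes ~1 degree of heading drift,
# and the error keeps growing linearly with time.
drift_deg = np.rad2deg(heading_biased[-1] - heading_ideal[-1])
```

The unbounded growth of this drift over minutes of operation is what magnetometer updates normally bound, and what DoorINet instead learns to compensate from accelerometer and gyroscope data alone.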

Original language: English
Pages (from-to): 1
Number of pages: 1
Journal: IEEE Sensors Journal
State: Accepted/In press - 2024

Bibliographical note

Publisher Copyright:
IEEE

Keywords

  • Accelerometers
  • AHRS
  • deep-learning
  • Estimation
  • Gyroscopes
  • heading angle
  • inertial sensors
  • Logic gates
  • Magnetometers
  • Sensors
  • Vectors

ASJC Scopus subject areas

  • Instrumentation
  • Electrical and Electronic Engineering

