Exploring potential gestures for controlling an eye-tracker based system

Ilana Arzis, Moayad Mokatren, Yasmin Felberbaum, Tsvi Kuflik

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Body gestures could be used as an intuitive interaction method with computerized systems. Previous studies explored gesture-based interaction mostly with digital displays, so there is no standard set of gestures for a system that lacks a display. In this work we conducted a pilot study to explore the potential of using gestures to control an eye-tracker-based mobile museum visitors' guide. Our objective was to identify a user-defined set of gestures for controlling the mobile guide. We present the preliminary results of the experiment and discuss the participants' suggestions and concerns about using this type of interaction.

Original language: English
Title of host publication: Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia, MUM 2021
Publisher: Association for Computing Machinery
Pages: 211-213
Number of pages: 3
ISBN (Electronic): 9781450386432
DOIs
State: Published - 12 May 2021
Event: 20th International Conference on Mobile and Ubiquitous Multimedia, MUM 2021 - Virtual, Online, Belgium
Duration: 5 Dec 2021 - 8 Dec 2021

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 20th International Conference on Mobile and Ubiquitous Multimedia, MUM 2021
Country/Territory: Belgium
City: Virtual, Online
Period: 5/12/21 - 8/12/21

Bibliographical note

Publisher Copyright:
© 2021 Owner/Author.

Keywords

  • Body gestures
  • Cultural heritage
  • Eye tracking
  • Multimodality

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Vision and Pattern Recognition
  • Computer Networks and Communications
