Abstract
Body gestures can serve as an intuitive method for interacting with computerized systems. Previous studies have explored gesture-based interaction mostly with digital displays, so there is no standard set of gestures for systems that lack a display. In this work we conducted a pilot study to explore the potential of using gestures to control an eye-tracker-based mobile museum visitors' guide. Our objective was to identify a user-defined set of gestures for controlling the mobile guide. We present the preliminary results of the experiment and discuss the participants' suggestions and concerns regarding this type of interaction.
Original language | English |
---|---|
Title of host publication | Proceedings of the 20th International Conference on Mobile and Ubiquitous Multimedia, MUM 2021 |
Publisher | Association for Computing Machinery |
Pages | 211-213 |
Number of pages | 3 |
ISBN (Electronic) | 9781450386432 |
DOIs | |
State | Published - 12 May 2021 |
Event | 20th International Conference on Mobile and Ubiquitous Multimedia, MUM 2021 - Virtual, Online, Belgium Duration: 5 Dec 2021 → 8 Dec 2021 |
Publication series
Name | ACM International Conference Proceeding Series |
---|
Conference
Conference | 20th International Conference on Mobile and Ubiquitous Multimedia, MUM 2021 |
---|---|
Country/Territory | Belgium |
City | Virtual, Online |
Period | 5/12/21 → 8/12/21 |
Bibliographical note
Publisher Copyright: © 2021 Owner/Author.
Keywords
- Body gestures
- Cultural heritage
- Eye tracking
- Multimodality
ASJC Scopus subject areas
- Software
- Human-Computer Interaction
- Computer Vision and Pattern Recognition
- Computer Networks and Communications