
Ergonomics for the design of multimodal interfaces

Title data

Heloir, Alexis ; Nunnari, Fabrizio ; Bachynskyi, Myroslav:
Ergonomics for the design of multimodal interfaces.
In: Oviatt, Sharon (ed.): The Handbook of Multimodal-Multisensor Interfaces. Vol. 3: Language Processing, Software, Commercialization, and Emerging Directions. - New York, NY : Association for Computing Machinery, 2019. - pp. 263-304
ISBN 978-1-970001-72-3
DOI: https://doi.org/10.1145/3233795.3233804

Abstract

There are many ways a machine can infer a user's intention or their cognitive and affective states: voice, voluntary movements, skin conductivity, eye movement, or muscle activation, to name a few. Voluntary movement, however, remains the privileged input channel for multimodal interfaces: it can be a button press, a mouse-mediated aimed movement, a direct touch on a screen, a mid-air gesture, or a full-body movement. The recent development of touch and motion-sensing technology broadens the interaction space by extending the number of input effectors: not only the fingertips but also the whole body now has the potential to support future input strategies. Indeed, touchscreens, inertial measurement units (IMUs), as well as RGB, stereo, and time-of-flight (ToF) camera sensors will eventually become standard components of ubiquitous multimodal systems.

Further data

Item Type: Article in a book
Refereed: Yes
Institutions of the University: Faculties > Faculty of Mathematics, Physics and Computer Science > Department of Computer Science > Chair Applied Computer Science VIII > Chair Applied Computer Science VIII - Univ.-Prof. Dr. Jörg Müller
Result of work at the UBT: No
DDC Subjects: 000 Computer science, information, general works
Date Deposited: 06 Apr 2020 08:56
Last Modified: 06 Apr 2020 08:56
URI: https://eref.uni-bayreuth.de/id/eprint/54831