TY - GEN
T1 - Multimodality Evaluation Metrics for Human-Robot Interaction Needed
T2 - International Conference on Applied Human Factors and Ergonomics
AU - Aaltonen, Iina
AU - Aromaa, Susanna
AU - Helin, Kaj
AU - Muhammad, Ali
N1 - Funding Information:
We thank test participants. Research funded under the European Commission’s Seventh Framework Aeronautics and Air Transport (AAT) programme in the project VR-HYPERSPACE (AAT-285681) ‘The innovative use of virtual and mixed reality to increase human comfort by changing the perception of self and space’. The work of I. Aaltonen was supported by Wihuri Foundation.
Publisher Copyright:
© Springer International Publishing AG 2018.
PY - 2018/1/1
Y1 - 2018/1/1
N2 - Multimodal, wearable technologies have the potential to enable a completely immersive teleoperation experience, which can benefit a number of teleoperated robotic applications. To gain the full benefit of these technologies, understanding the user perspective of human-robot interaction (HRI) is of special relevance for highly advanced future telerobotic systems. In telerobotics research, however, the complex nature of multimodal interaction has not attracted much attention. We studied HRI with a wearable multimodal control system used for teleoperating a mobile robot, and recognized a need for evaluation metrics for multimodality. In the case study, questionnaires, interviews, observations and video analysis were used to evaluate usability, ergonomics, immersion, and the nature of multimodal interaction. Although the technical setup was challenging, our findings provide insights into the design and evaluation of user interaction in future immersive teleoperation systems. We propose two new HRI evaluation metrics: Type of multimodal interaction and Wearability.
AB - Multimodal, wearable technologies have the potential to enable a completely immersive teleoperation experience, which can benefit a number of teleoperated robotic applications. To gain the full benefit of these technologies, understanding the user perspective of human-robot interaction (HRI) is of special relevance for highly advanced future telerobotic systems. In telerobotics research, however, the complex nature of multimodal interaction has not attracted much attention. We studied HRI with a wearable multimodal control system used for teleoperating a mobile robot, and recognized a need for evaluation metrics for multimodality. In the case study, questionnaires, interviews, observations and video analysis were used to evaluate usability, ergonomics, immersion, and the nature of multimodal interaction. Although the technical setup was challenging, our findings provide insights into the design and evaluation of user interaction in future immersive teleoperation systems. We propose two new HRI evaluation metrics: Type of multimodal interaction and Wearability.
KW - human-robot interaction
KW - immersion
KW - metrics
KW - multimodal
KW - telerobotics
KW - user studies
KW - wearable
UR - http://www.scopus.com/inward/record.url?scp=85041041850&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-60384-1_32
DO - 10.1007/978-3-319-60384-1_32
M3 - Conference article in proceedings
SN - 978-3-319-60383-4
SN - 978-3-319-60384-1
T3 - Advances in Intelligent Systems and Computing
SP - 335
EP - 347
BT - Advances in Human Factors in Robots and Unmanned Systems. AHFE 2017
A2 - Chen, Jessie
PB - Springer
Y2 - 17 July 2017 through 21 July 2017
ER -