TY - GEN
T1 - How Indirect and Direct Interaction Affect the Trustworthiness in Normal and Explainable Human-Robot Interaction
AU - Pham, Truong An
AU - Espinosa-Leal, Leonardo
PY - 2024
Y1 - 2024
AB - Human-robot interaction (HRI) attracts significant public attention due to the ubiquity of robots in factories, restaurants, and even at home. However, user engagement with robots remains an open question because establishing trustworthiness is challenging. Trustworthiness becomes more complicated when robotic design must account for both indirect interaction, in which humans merely observe the robot, and direct interaction, in which humans and robots are in close proximity and may or may not interact. Several studies have analyzed human trust in either the indirect or the direct aspect of robotic systems, but the lack of a benchmark comparing the two leaves a significant gap in designing more nuanced robotic systems for complex scenarios involving different stakeholders, such as users and observers (indirect users). In this study, we propose a novel guideline for evaluating such robotic systems in human-robot interaction. In particular, we analyze how indirect and direct interaction differ with respect to human trustworthiness in HRI. In addition, we investigate simulation methodologies, including virtual reality and video, for evaluating a human-robot interaction scenario with both a normal and an explainable robotic system, the latter integrating a visual feedback module. Quantitative and qualitative experiments show no significant difference between indirect and direct interaction in the trustworthiness of HRI. Instead, the explainable feature is identified as the key factor in improving the trustworthiness of a robotic system.
KW - augmented reality
KW - human-robot interaction
KW - virtual reality
UR - http://www.scopus.com/inward/record.url?scp=85197467350&partnerID=8YFLogxK
U2 - 10.1007/978-3-031-61905-2_40
DO - 10.1007/978-3-031-61905-2_40
M3 - Conference article in proceedings
SN - 978-3-031-61904-5
VL - 2
T3 - Lecture Notes in Networks and Systems
SP - 411
EP - 422
BT - Smart Technologies for a Sustainable Future
A2 - Auer, Michael E.
A2 - Langmann, Reinhard
A2 - May, Dominik
A2 - Roos, Kim
PB - Springer
T2 - 21st International Conference on Smart Technologies & Education (STE-2024)
Y2 - 6 March 2024 through 8 March 2024
ER -