Early-stage user experience design of the remote operation concept of the harbour’s reachstacker by exploiting eXtended Reality

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific

Abstract

The main objective of this abstract is to present a comprehensive approach involving user experience design, implementation, user testing, and iterative refinement of a Wizard-of-Oz interface via extended reality for remotely operating reach stackers, an off-highway type of vehicle used in logistics hubs. The proposed approach follows the main principles of human-centred design with several iteration loops. This abstract presents the development and results of the first two iteration prototypes.

The use case

The use case target is KALMAR's remote-controlled reach stacker, which handles containers in a harbour. In the current setup, operators control these machines remotely with their eyes on video screens. From the footage, they observe how the container approaches the landing point, e.g., a truck or another stack of containers, and how it settles in place. Through such footage it is extremely difficult to perceive depth, which is essential when positioning the container. Supporting the operator in perceiving depth and staying aware of items not visible in the cameras is a challenge when the machine must be operated remotely over longer distances.

The first functional prototype and evaluation

The main approach of the prototype was to exploit the digital twin-based XR prototype of KALMAR's reach stacker for remote operation. The XR set-up was based on a Varjo XR-3 headset, which allows the user to see the actual remote operation station as well as a virtual representation of the reach stacker and the harbour environment. The user controlled the reach stacker with a Logitech G29 steering wheel and a Logitech Extreme 3D joystick, and the virtual environment with hand gestures. The first user evaluation was held at VTT's XR lab with KALMAR personnel on 16.5.2023. Four test subjects executed simple pick-and-place tasks. To assess the usability of the prototype, users were asked to fill in the System Usability Scale (SUS); observation and a group discussion were also used. The first prototype achieved a SUS score of 59, meaning usability was evaluated as marginal but not acceptable, and the following improvements were identified: (1) the concept works, but end-users are not willing to wear a head-mounted display all the time, (2) more augmented information should be provided on the actual screens and in the XR environment, and (3) controls should remain on a physical device, e.g., a tablet or the joystick's buttons.

The second functional prototype and evaluation

The second prototype focused on a simulated 360° camera view and embedded augmented content, a need highlighted several times during the first evaluation. The augmented reality content was based on several available sensor and other data sources. The user was able to choose between four different camera views: 1) cabin view/driver's perspective, 2) spreader top-down view, 3) rear view/counterweight, and 4) drone/bird's-eye view. Visualization was done with an XR powerwall. The second user evaluation was held at VTT's XR lab with KALMAR personnel on 30.11.2023. Four test subjects executed simple pick-and-place tasks. The second prototype achieved a SUS score of 70, meaning usability was evaluated as acceptable.

Based on the two development cycles, the XR environment combined with human-centric evaluation and design appears to be a powerful method for early-stage user experience design. In both sessions, potential users were able to give valuable feedback on novel approaches to the remote operation concept. However, there is a learning curve in adopting the novel XR UI and interaction concepts, especially gesture-based interaction. In the next phase, the most suitable features from the two iteration cycles will be selected and implemented in the third XR prototype. Some new features, such as haptics, will also be implemented. Based on the third human-centric iteration, the first version of the actual remote operation station will be specified and implemented.
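For context on how the reported SUS scores of 59 and 70 are derived, the sketch below applies the standard SUS scoring formula (ten 1-5 Likert items; odd items positively worded, even items negatively worded; the 0-40 sum is scaled to 0-100). The example responses are hypothetical and are not the study's actual questionnaire data.

```python
# Illustrative SUS scoring sketch (standard SUS formula); the example
# responses below are hypothetical, not data from this study.

def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 Likert responses."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd-numbered items are positively worded, even-numbered negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 sum to the 0-100 SUS range

# Example: one hypothetical participant's answers
print(sus_score([4, 2, 4, 3, 4, 2, 3, 2, 4, 3]))  # -> 67.5
```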
Original language: English
Title of host publication: Usability and User Experience. AHFE (2024) International Conference
ISBN (Electronic): 978-1-964867-32-8
DOIs
Publication status: Published - 2024
MoE publication type: B3 Non-refereed article in conference proceedings
Event: 15th International Conference on Applied Human Factors and Ergonomics (AHFE 2024) and the Affiliated Conferences - Université Côte d'Azur, Nice, France
Duration: 24 Jul 2024 – 27 Jul 2024
Conference number: 15

Publication series

Series: AHFE International
Volume: 156
ISSN: 2771-0718

Conference

Conference: 15th International Conference on Applied Human Factors and Ergonomics (AHFE 2024) and the Affiliated Conferences
Abbreviated title: AHFE
Country/Territory: France
City: Nice
Period: 24/07/24 – 27/07/24

Funding

This project has received funding from the European Union Horizon Europe research and innovation programme under grant agreement No 101092861. Views and opinions expressed are, however, those of the author(s) only and do not necessarily reflect those of the European Union or the European Commission. Neither the European Union nor the granting authority can be held responsible for them.

Keywords

  • UX
  • XR
  • HCD
  • remote control
  • logistics
