Exploring driving simulator data classification based on eye metrics

Evgenii Rudakov, Kati Pettersson, Benjamin Cowley, Jani Mäntyjärvi

Research output: Chapter in Book/Report/Conference proceeding › Conference abstract in proceedings › Professional

Abstract

Cognitive readiness is important for facing everyday challenges. Automatic detection of weakened cognitive readiness could be beneficial in many fields where high human performance is necessary, e.g. driving.

Changes in cognitive state are reflected in various biosignals and can be detected with the help of machine learning methods. The eyes are the main instrument that allows humans to explore the world around them, producing a number of signals potentially useful for cognitive state detection. Nevertheless, there is a lack of studies exploring the use of eye metrics in machine learning for cognitive state detection.

This study explores and evaluates the use of eye metrics in the classification of driving simulator data. The thesis demonstrates a classification procedure for distinguishing between non-crash, before-crash, after-crash, and crash events in a driving simulator game called CogCarSim.

A set of potentially discriminative features was extracted from electro-oculography (EOG) and video-oculography (VOG) signals: 7 pupil diameter features, 6 gaze point features, and 11 saccade and fixation features from the VOG signal, and 10 blink features from the EOG signal.
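As a rough illustration of how blink features can be derived from an EOG trace, the sketch below counts blinks and measures their durations by thresholding the signal amplitude. The threshold value, sampling rate, and function name are illustrative assumptions, not the study's actual extraction pipeline.

```python
def blink_features(eog, threshold=200.0, fs=500):
    """Hypothetical blink-feature sketch for a vertical EOG trace.

    eog: iterable of EOG samples (e.g. microvolts); threshold: amplitude
    above which a sample is treated as part of a blink (assumed value);
    fs: sampling rate in Hz. Returns (blink count, mean blink duration in s).
    """
    durations = []
    run = 0  # length of the current above-threshold run, in samples
    for sample in eog:
        if sample > threshold:
            run += 1
        elif run:
            # Run just ended: convert its length to seconds and record it.
            durations.append(run / fs)
            run = 0
    if run:  # trace ended mid-blink
        durations.append(run / fs)
    count = len(durations)
    mean_duration = sum(durations) / count if count else 0.0
    return count, mean_duration
```

A real pipeline would typically band-pass filter the signal and merge runs separated by very short gaps before thresholding; this sketch only shows the basic run-length idea.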

Based on the extracted features, a set of derivative features was generated. In the suggested approach, several machine learning methods were tested together with class balancing techniques and normalization. The best result, 69.94% balanced accuracy, was achieved using a personalized Extreme Gradient Boosting model. The features extracted from the pupil and gaze points were found to be the most discriminative, contributing 13.78% and 6.31% to the overall performance, respectively.
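Balanced accuracy, the metric reported above, is the unweighted mean of per-class recall; it is appropriate here because the four event classes are presumably heavily imbalanced (crashes are rare), so plain accuracy would be dominated by the majority class. A minimal sketch of the metric, with illustrative class labels:

```python
from collections import defaultdict

def balanced_accuracy(y_true, y_pred):
    """Balanced accuracy: the unweighted mean of per-class recall.

    Each class contributes equally regardless of how many samples it has,
    unlike plain accuracy, which a dominant class (e.g. non-crash) would swamp.
    """
    correct = defaultdict(int)  # per-class correct predictions
    total = defaultdict(int)    # per-class sample counts
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        if t == p:
            correct[t] += 1
    recalls = [correct[c] / total[c] for c in total]
    return sum(recalls) / len(recalls)
```

For example, a classifier that labels everything "non-crash" scores high plain accuracy on imbalanced data but only 1/k balanced accuracy over k classes, since every minority-class recall is zero.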
The classification performance increased to 75.84% when utilizing both gameplay and eye-based features, while a model trained only on gameplay features performed on par with the proposed model at 68.52%.

The promising results of the study suggest that the selected approach could serve as a basis for future research such as real-time cognitive state detection and collision prediction.
Original language: English
Title of host publication: Mind and Matter: Conversations Across Disciplines
Subtitle of host publication: Programme and Abstracts
Publisher: University of Helsinki
Publication status: Published - 6 Jun 2023
MoE publication type: D3 Professional conference proceedings
Event: Mind and Matter 2023: Conversations across disciplines - Metsätalo, University of Helsinki, Helsinki, Finland
Duration: 6 Jun 2023 – 8 Jun 2023
https://www.helsinki.fi/en/conferences/mind-and-matter-2023/programme

Conference

Conference: Mind and Matter 2023: Conversations across disciplines
Country/Territory: Finland
City: Helsinki
Period: 6/06/23 – 8/06/23
Internet address: https://www.helsinki.fi/en/conferences/mind-and-matter-2023/programme
