Abstract
Cognitive readiness is important for facing everyday challenges. Automatic detection of weakened cognitive readiness could be beneficial in many fields where high human performance is required, e.g. driving.
Changes in the cognitive state are reflected in various biosignals and can be detected with the help of machine learning methods. The eyes are the main instrument through which humans explore the world around them, producing a number of signals potentially useful for cognitive state detection. Nevertheless, there is a lack of studies exploring the use of eye metrics in machine learning for cognitive state detection.
This study explores and evaluates the use of eye metrics in the classification of driving simulator data. The thesis demonstrates a classification procedure for distinguishing between non-crash, before-crash, after-crash, and crash events in a driving simulator game called CogCarSim.
A set of potentially discriminative features was extracted from electro-oculography (EOG) and video-oculography (VOG) signals: 7 pupil diameter features, 6 gaze point features, and 11 saccade and fixation features from the VOG signal, and 10 blink features from the EOG signal.
Based on the extracted features, a set of derivative features was generated. In the suggested approach, several machine learning methods were tested with class balancing techniques and normalization. The best result, a balanced accuracy of 69.94%, was achieved with a personalized Extreme Gradient Boosting model. The features extracted from pupil diameter and gaze points were found to be the most discriminative, contributing 13.78% and 6.31% to the overall performance, respectively.
Utilizing both gameplay and eye-based features increased the classification performance to 75.84%, while a model trained only on gameplay features performed on par with the proposed model, reaching 68.52%.
The promising results of the study suggest that the selected approach could serve as a basis for future research such as real-time cognitive state detection and collision prediction.
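Because crash events are much rarer than non-crash driving, plain accuracy would be misleading here; balanced accuracy, the metric reported above, averages per-class recall instead. The following is a generic illustrative sketch of that metric (not code from the thesis), with hypothetical labels for the four event classes:

```python
from collections import defaultdict

def balanced_accuracy(y_true, y_pred):
    """Average of per-class recall; robust to class imbalance,
    e.g. rare crash events among frequent non-crash samples."""
    correct = defaultdict(int)  # correct predictions per true class
    total = defaultdict(int)    # sample count per true class
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        if t == p:
            correct[t] += 1
    recalls = [correct[c] / total[c] for c in total]
    return sum(recalls) / len(recalls)

# Hypothetical example with the four event classes from the study
y_true = ["non-crash", "non-crash", "non-crash", "crash", "before-crash", "after-crash"]
y_pred = ["non-crash", "non-crash", "crash", "crash", "before-crash", "non-crash"]
print(round(balanced_accuracy(y_true, y_pred), 4))  # prints 0.6667
```

A majority-class classifier that always predicts "non-crash" would score only 25% balanced accuracy on four classes, which makes the reported 69.94% a meaningful improvement over chance.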
Original language | English |
---|---|
Title of host publication | Mind and Matter: Conversations Across Disciplines |
Subtitle of host publication | Programme and Abstracts |
Publisher | University of Helsinki |
Publication status | Published - 6 Jun 2023 |
MoE publication type | D3 Professional conference proceedings |
Event | Mind and Matter 2023: Conversations across disciplines - Metsätalo, University of Helsinki, Helsinki, Finland Duration: 6 Jun 2023 → 8 Jun 2023 https://www.helsinki.fi/en/conferences/mind-and-matter-2023/programme |
Conference
Conference | Mind and Matter 2023: Conversations across disciplines |
---|---|
Country/Territory | Finland |
City | Helsinki |
Period | 6/06/23 → 8/06/23 |
Internet address | https://www.helsinki.fi/en/conferences/mind-and-matter-2023/programme |