Abstract
A light-weight, wearable, wireless gaze tracker with integrated
selection command source for human-computer interaction is introduced.
The prototype system combines head-mounted, video-based gaze tracking
with capacitive facial movement detection, enabling multimodal
interaction: pointing with gaze and making selections with facial gestures.
The system is targeted mainly at people with disabilities that limit
the use of their hands. The hardware was made wireless to remove the need to
take off the device when moving away from the computer, and to allow
future use in more mobile contexts. The algorithms that determine eye
and head orientation and map gaze direction to on-screen coordinates
are presented, together with the algorithm that detects facial
movements from the measured capacitance signal. Point-and-click
experiments were conducted to assess the performance of the multimodal
system. The results show decent performance in laboratory and office
conditions. The overall point-and-click accuracy in the multimodal
experiments is comparable to that reported in previous research on
head-mounted, single-modality gaze tracking that does not compensate
for changes in head orientation.
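The abstract does not describe the detection algorithm itself, but the idea of extracting selection commands from a capacitance signal can be illustrated with a minimal, hypothetical sketch: flag a movement onset whenever the signal deviates from a running baseline by more than a chosen number of standard deviations. All names and parameters below are assumptions for illustration, not the paper's method.

```python
# Hypothetical sketch: threshold-based detection of facial movements
# from a capacitance signal. The paper's actual algorithm is not given
# in this abstract; window size and threshold are illustrative choices.

def detect_movements(signal, baseline_window=20, threshold=3.0):
    """Return indices where the capacitance deviates from a running
    baseline by more than `threshold` standard deviations."""
    events = []
    in_event = False
    for i in range(baseline_window, len(signal)):
        window = signal[i - baseline_window:i]
        mean = sum(window) / baseline_window
        var = sum((x - mean) ** 2 for x in window) / baseline_window
        std = var ** 0.5 or 1e-9  # avoid division by zero on flat signal
        deviation = abs(signal[i] - mean) / std
        if deviation > threshold and not in_event:
            events.append(i)  # rising edge: movement onset
            in_event = True
        elif deviation <= threshold:
            in_event = False  # signal back near baseline; re-arm detector
    return events
```

A real detector would also debounce the signal and adapt the baseline slowly during events; this sketch only shows the edge-triggered thresholding idea.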
| Original language | English |
| --- | --- |
| Pages (from-to) | 795-801 |
| Journal | IEEE Transactions on Information Technology in Biomedicine |
| Volume | 15 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - 2011 |
| MoE publication type | A1 Journal article-refereed |
Keywords
- assistive technology
- capacitive facial movement detection
- gaze tracking
- human-computer interaction
- multimodal interaction