Abstract
This paper proposes the addition of non-speech sounds to aid people who use scanning as their method of input. Scanning input is a temporal task: users must press a switch when a cursor is over the required target. However, it is usually presented as a spatial task, with the items to be scanned laid out in a grid. Research has shown that for temporal tasks the auditory modality is often better than the visual one. This paper investigates this by adding non-speech sound to a visual scanning system. It also shows how our natural ability to perceive rhythms can be supported so that it can be used to aid the scanning process. Structured audio messages called earcons were used for the sound output. The results from a preliminary investigation were favourable, indicating that the idea is feasible and that further research should be undertaken.
Original language | English |
---|---|
Title of host publication | Assets '96 |
Subtitle of host publication | Proceedings of the second annual ACM conference on Assistive technologies |
Editors | Ephraim P. Glinert, David L. Jaffe |
Place of Publication | New York |
Publisher | Association for Computing Machinery (ACM) |
Pages | 10-14 |
ISBN (Print) | 978-0-89791-776-6 |
DOIs | |
Publication status | Published - 1996 |
MoE publication type | A4 Article in a conference publication |
Event | 2nd ACM/SIGCAPH Conference on Assistive Technologies (ASSETS '96) - Vancouver, Canada. Duration: 11 Apr 1996 → 12 Apr 1996 |
Conference
Conference | 2nd ACM/SIGCAPH Conference on Assistive Technologies (ASSETS '96) |
---|---|
Country/Territory | Canada |
City | Vancouver |
Period | 11/04/96 → 12/04/96 |