Touch n' Tag: Digital annotation of physical objects with voice tagging

Antti Konttila, Marja Harjumaa, Salla Muuraiskangas, Mikko Jokela, Minna Isomursu

    Research output: Contribution to journal › Article › Scientific › peer-review

    5 Citations (Scopus)


    Purpose - This article aims to explore the possibilities and use of a mobile technology-supported audio annotation system that can be used for attaching free-formatted audio annotations to physical objects. The solution can help visually impaired people to identify objects and associate additional information with these objects.

    Design/methodology/approach - A human-centred design approach was adopted in the system's development, and potential end-users were involved in the development process. In order to evaluate the emerging use cases, as well as the usefulness and usability of the application, a qualitative field trial was conducted with ten visually impaired or blind users.

    Findings - The findings show that visually impaired users learned to use the application easily and found it easy and robust to use. Most users responded positively towards the idea of tagging items with their own voice messages. Some users found the technology very useful and saw many possibilities for using it in the future. The most common targets for tagging were food items; however, some users had difficulties in integrating the solution with their everyday practices.
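The abstract describes the core mechanism only at a high level: each physical object carries an NFC tag, and touching the tag with the phone either records or plays back a free-formatted voice annotation keyed by that tag. The article itself includes no source code; the following is a minimal illustrative sketch of such a tag-to-annotation lookup, where the class, tag UIDs, and file paths are all hypothetical, not the authors' implementation.

```python
# Hypothetical sketch of the idea in the abstract: an NFC tag's unique ID
# keys a user-recorded audio annotation. All names and values here are
# illustrative assumptions, not code from the Touch n' Tag system.

class VoiceTagStore:
    """Maps NFC tag UIDs to recorded audio annotations."""

    def __init__(self):
        self._annotations = {}  # tag UID -> path of the recorded audio clip

    def record(self, tag_uid: str, audio_path: str) -> None:
        # Called after the user touches a tag and records a voice message.
        self._annotations[tag_uid] = audio_path

    def lookup(self, tag_uid: str):
        # Called when a tag is touched again; returns the clip to play back,
        # or None if the tag has not been annotated yet.
        return self._annotations.get(tag_uid)


store = VoiceTagStore()
store.record("04:A2:3B:1C", "/audio/milk_carton.amr")
print(store.lookup("04:A2:3B:1C"))  # path of the annotation recorded above
print(store.lookup("04:FF:00:01"))  # an untagged object -> None
```

In the trial described above, the same touch gesture would trigger either recording (for an unknown tag) or playback (for a known one); the dictionary here stands in for whatever on-device storage the real application used.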
    Original language: English
    Pages (from-to): 24-37
    Journal: Journal of Assistive Technologies
    Issue number: 1
    Publication status: Published - 2012
    MoE publication type: A1 Journal article-refereed


    • Audio
    • digital technology
    • identity
    • mobile phone
    • mobile technology
    • NFC
    • tag
    • visually impaired


