Multimodal User Interface for Augmented Assembly

Sanni Siltanen, Mika Hakkarainen, Otto Korkalo, Tapio Salonen, Juha Sääski, Charles Woodward, Theofanis Kannetis, Manolis Perakakis, Alexandros Potamianos

    Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

    11 Citations (Scopus)


    In this paper, a multimodal system for augmented reality aided assembly work is designed and implemented. The multimodal interface allows for speech and gestural input. The system emulates a simplified assembly task in a factory. A 3D puzzle is used to study how to implement the augmented assembly system in a real factory setting. The system serves as a demonstrator and as a test-bed for evaluating different input modalities for augmented assembly setups. Preliminary system evaluation results are presented, the user experience is discussed, and some directions for future work are given.
    Original language: English
    Title of host publication: Proceedings
    Subtitle of host publication: IEEE 9th Workshop on Multimedia Signal Processing, MMSP 2007
    Place of publication: Piscataway, NJ, USA
    Publisher: IEEE - Institute of Electrical and Electronics Engineers
    ISBN (Print): 978-1-4244-1273-0, 978-1-4244-1274-7
    Publication status: Published - 2007
    MoE publication type: A4 Article in a conference publication
    Event: IEEE 9th Workshop on Multimedia Signal Processing, MMSP 2007 - Crete, Greece
    Duration: 1 Oct 2007 - 3 Oct 2007


    Conference: IEEE 9th Workshop on Multimedia Signal Processing, MMSP 2007
    Abbreviated title: MMSP 2007

