Real-time markerless Augmented Reality for Remote Handling system in bad viewing conditions

Z. Ziaei (Corresponding Author), A. Hahto, J. Mattila, Mikko Siuko, L. Semeraro

    Research output: Contribution to journal › Article › Scientific › peer-review

    15 Citations (Scopus)

    Abstract

    Remote Handling (RH) in harsh environments usually suffers from insufficient visual feedback for the human operator, owing to the limited number of on-site cameras, suboptimal camera placement, poor viewing angles, occlusion, and camera failure. Augmented Reality (AR) enables the user to perceive virtual, computer-generated objects in a real scene; its most common goals are visibility enhancement and the provision of extra information, such as positional data of various objects. The proposed AR system first recognizes and locates the markerless object using a template-based matching algorithm, and then augments the virtual model on top of the recognized item. A tracking algorithm locates the object across a continuous sequence of frames. Conceptually, the template is found by computing the similarity between the template and the image frame for all relevant template poses (rotations and translations). As a case study, the AR interface displayed the measured orientation and transformation of the Water Hydraulic Manipulator (WHMAN) Divertor preloading tool with near real-time tracking. A bad viewing condition here refers to the case in which the viewing angle leaves the interesting features of the object outside the field of view. The method was validated in a concrete operational context at the Divertor Test Platform 2 (DTP2) and proved to deliver robust position and orientation information while augmenting and tracking the moving tool object.
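
    The paper itself does not include code. As a rough illustration of the template-based matching idea described in the abstract, the sketch below scores a rotated template against a camera frame with normalized cross-correlation (translation is covered by the sliding-window search) and keeps the best-scoring pose. It assumes OpenCV and grayscale images; the function and parameter names (e.g. match_template_poses, angle_step) are illustrative only and are not taken from the paper.

    # Minimal sketch of template-based matching over a discretised set of
    # rotations, assuming OpenCV (cv2) and grayscale inputs.
    import cv2

    def match_template_poses(frame_gray, template_gray, angle_step=10):
        """Return (score, angle_deg, top_left) of the best-scoring template pose."""
        h, w = template_gray.shape[:2]
        centre = (w / 2.0, h / 2.0)
        best = (-1.0, 0, (0, 0))
        for angle in range(0, 360, angle_step):
            # Rotate the template about its centre; translation is handled by
            # the sliding-window search performed inside cv2.matchTemplate.
            rot = cv2.getRotationMatrix2D(centre, angle, 1.0)
            rotated = cv2.warpAffine(template_gray, rot, (w, h))
            # Normalised cross-correlation between the rotated template and
            # the current camera frame.
            scores = cv2.matchTemplate(frame_gray, rotated, cv2.TM_CCOEFF_NORMED)
            _, max_val, _, max_loc = cv2.minMaxLoc(scores)
            if max_val > best[0]:
                best = (max_val, angle, max_loc)
        return best

    In a tracking loop, one would typically restrict the pose search for the next frame to a neighbourhood of the previous estimate, which is what keeps this kind of matching near real time.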
    Original language: English
    Pages (from-to): 2033-2038
    Number of pages: 7
    Journal: Fusion Engineering and Design
    Volume: 86
    Issue number: 9-11
    DOI: 10.1016/j.fusengdes.2010.12.082
    Publication status: Published - 2011
    MoE publication type: A1 Journal article-refereed

    Keywords

    • Augmented reality
    • remote handling
    • template based matching
    • CAD-model
    • position
    • orientation

    Cite this

    Ziaei, Z., Hahto, A., Mattila, J., Siuko, M., & Semeraro, L. (2011). Real-time markerless Augmented Reality for Remote Handling system in bad viewing conditions. Fusion Engineering and Design, 86(9-11), 2033-2038. https://doi.org/10.1016/j.fusengdes.2010.12.082
    @article{e9806e19ed0d49a9b893d423265bfdf7,
      title     = "Real-time markerless Augmented Reality for Remote Handling system in bad viewing conditions",
      author    = "Z. Ziaei and A. Hahto and J. Mattila and Mikko Siuko and L. Semeraro",
      journal   = "Fusion Engineering and Design",
      year      = "2011",
      volume    = "86",
      number    = "9-11",
      pages     = "2033--2038",
      doi       = "10.1016/j.fusengdes.2010.12.082",
      issn      = "0920-3796",
      publisher = "Elsevier",
      language  = "English",
      keywords  = "Augmented reality, remote handling, template based matching, CAD-model, position, orientation",
    }
