System and method for supporting synchronous and asynchronous augmented reality functionalities

Seppo Valli (Inventor), Pekka Siltanen (Inventor)

    Research output: Patent › Patent application

    Abstract

    Augmented reality (AR) telepresence systems supporting synchronous and asynchronous augmented reality functionalities are disclosed. One embodiment takes the form of a method that comprises: receiving 3D model information of a first real-world space during a first interactive augmented reality (AR) session anchored in the first real-world space; subsequent to the termination of the first AR session: rendering a representation of the first real-world space based upon the 3D model information received during the first interactive AR session, receiving information regarding user input of the placement of a virtual object relative to the 3D model information, and rendering, during a second interactive AR session anchored in the first real-world space, the virtual object at a position corresponding to the information regarding user input of the placement of the virtual object relative to the 3D model information.
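    For illustration, a minimal Python sketch of the workflow the abstract describes: capture 3D model information during a first AR session, let a user place a virtual object against the stored model after that session ends, and render the object at the same model-relative position in a later session anchored in the same space. This is not taken from the patent and is not the patented implementation; all names and data structures below are hypothetical.

        # Hypothetical sketch of the asynchronous-placement flow described in the abstract.
        from dataclasses import dataclass, field

        Vec3 = tuple[float, float, float]

        @dataclass
        class SpaceModel:
            space_id: str
            points: list[Vec3]   # 3D model information captured in the first AR session

        @dataclass
        class VirtualObject:
            name: str
            position: Vec3       # placement expressed relative to the space model

        @dataclass
        class SpatialStore:
            models: dict[str, SpaceModel] = field(default_factory=dict)
            objects: dict[str, list[VirtualObject]] = field(default_factory=dict)

            def save_model(self, model: SpaceModel) -> None:
                # First (synchronous) session: persist the captured 3D model information.
                self.models[model.space_id] = model

            def place_object(self, space_id: str, obj: VirtualObject) -> None:
                # Asynchronous step after the first session has terminated:
                # record user input placing an object relative to the stored model.
                self.objects.setdefault(space_id, []).append(obj)

            def render_in_session(self, space_id: str) -> list[str]:
                # Second (synchronous) session anchored in the same real-world space:
                # render each object at its stored model-relative position.
                return [f"render {o.name} at {o.position}"
                        for o in self.objects.get(space_id, [])]

        if __name__ == "__main__":
            store = SpatialStore()
            store.save_model(SpaceModel("office-1", points=[(0, 0, 0), (1, 0, 0)]))   # session 1
            store.place_object("office-1", VirtualObject("poster", (0.5, 1.2, 0.0)))  # offline placement
            print(store.render_in_session("office-1"))                                # session 2

    In this sketch the persistent store stands in for whatever server-side component keeps the 3D model between sessions; the essential point is that the object's position is stored relative to the model, so it can be re-anchored when a later session is registered to the same space.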
    Original language: English
    Patent number: WO17177019A1
    Priority date: 8 Apr 2016
    Filing date: 6 Apr 2017
    Publication status: Published - 12 Oct 2017
    MoE publication type: Not Eligible
