BRADFORD SCHOLARS


    Context-aware mixed reality: A learning-based framework for semantic-level interaction

    View/Open: Wan_et_al-2020-Computer_Graphics_Forum.pdf (2.228 MB)
    Publication date: 2020-02
    Authors: Chen, L.; Tang, W.; John, N.W.; Wan, Tao Ruan; Zhang, J.J.
    Keywords: Interaction techniques; Interaction; Methods and applications-computer games; Methods and applications; Augmented reality; Virtual environments
    Rights: © 2019 The Authors. Computer Graphics Forum published by Eurographics - The European Association for Computer Graphics and John Wiley & Sons Ltd. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
    Peer-Reviewed: Yes
    
    Abstract
    Mixed reality (MR) is a powerful interactive technology for new types of user experience. We present a semantic-based interactive MR framework that goes beyond current geometry-based approaches, offering a step change in generating high-level context-aware interactions. Our key insight is that by building semantic understanding into MR, we can develop a system that not only greatly enhances the user experience through object-specific behaviours, but also paves the way for solving complex interaction design challenges. In this paper, our proposed framework generates semantic properties of the real-world environment through a dense scene reconstruction and deep image understanding scheme. We demonstrate our approach by developing a material-aware prototype system for context-aware physical interactions between real and virtual objects. Quantitative and qualitative evaluation results show that the framework delivers accurate and consistent semantic information in an interactive MR environment, providing effective real-time semantic-level interactions.
    URI: http://hdl.handle.net/10454/17543
    Version: Published version
    Citation: Chen L, Tang W, John NW et al (2020) Context-aware mixed reality: A learning-based framework for semantic-level interaction. Computer Graphics Forum. 39(1): 484-496.
    Link to publisher's version: https://doi.org/10.1111/cgf.13887
    Type: Article
    Collections: Engineering and Informatics Publications
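
    The abstract above outlines the pipeline at a high level: dense scene reconstruction and deep image understanding supply per-surface semantic (material) labels, which then drive object-specific physical interactions. As a rough illustration only, the Python sketch below shows one way such labels could be attached to reconstructed surface points and mapped to collision parameters; the label table, projection helper, and all parameter values are hypothetical and are not taken from the paper.

    # Illustrative sketch, not the authors' implementation. Assumes a
    # segmentation network has already produced a per-pixel material-label
    # map and a dense reconstruction has produced 3-D surface points in the
    # camera frame; all names and values below are hypothetical.
    import numpy as np

    LABEL_NAMES = {0: "wood", 1: "carpet", 2: "metal", 3: "glass"}

    # Hypothetical per-material collision parameters for virtual objects
    # striking a real surface of that material.
    MATERIAL_PHYSICS = {
        "wood":   {"restitution": 0.45, "friction": 0.50},
        "carpet": {"restitution": 0.10, "friction": 0.90},
        "metal":  {"restitution": 0.65, "friction": 0.30},
        "glass":  {"restitution": 0.70, "friction": 0.20},
    }
    DEFAULT_PHYSICS = {"restitution": 0.30, "friction": 0.60}

    def label_surface_points(points_cam, seg_ids, intrinsics):
        """Project 3-D surface points (camera frame, z > 0) into the current
        segmentation map and return one material label id per point."""
        fx, fy, cx, cy = intrinsics
        u = np.round(points_cam[:, 0] / points_cam[:, 2] * fx + cx).astype(int)
        v = np.round(points_cam[:, 1] / points_cam[:, 2] * fy + cy).astype(int)
        h, w = seg_ids.shape
        u = np.clip(u, 0, w - 1)
        v = np.clip(v, 0, h - 1)
        return seg_ids[v, u]

    def physics_for_contact(label_id):
        """Look up collision parameters for a contact with a labelled surface,
        falling back to defaults for unknown labels."""
        return MATERIAL_PHYSICS.get(LABEL_NAMES.get(label_id), DEFAULT_PHYSICS)

    # Example: the same virtual ball barely bounces on "carpet" (label 1)
    # but rebounds noticeably on "metal" (label 2).
    print(physics_for_contact(1))   # {'restitution': 0.1, 'friction': 0.9}
    print(physics_for_contact(2))   # {'restitution': 0.65, 'friction': 0.3}

    In a full system the reconstruction, segmentation and physics simulation would run continuously; the sketch only captures the label-to-parameter lookup that makes an interaction material-aware.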
