XR Research

Brewed for Industry
Served as Academic Publications
Rompapas, Damien Constantine; Campbell, James; Ta, Vincent; Cassinelli, Alvaro
Project Ariel: An Open Source Augmented Reality Headset for Industrial Applications Conference
2021.
@conference{Rompapas2018d,
title = {Project Ariel: An Open Source Augmented Reality Headset for Industrial Applications},
author = {Damien Constantine Rompapas and James Campbell and Vincent Ta and Alvaro Cassinelli},
url = {https://dl.acm.org/doi/abs/10.1145/3460418.3479359
https://beer-labs.net/wp-content/uploads/2021/12/Project_Ariel__An_Open_Source_Augmented_Reality_Headset_for_Industrial_Applications.pdf},
year = {2021},
date = {2021-09-21},
urldate = {2021-09-21},
abstract = {Some of the biggest challenges in applying Augmented Reality (AR) technologies to the industry floor lie in the form factor and safety requirements of the head-worn display. These include alleviating issues such as peripheral-view occlusion and adaptation to personal protective equipment. In this work we present the design of Project Ariel, an Open Source, 3D-printable display specifically designed for use in industrial environments. It is our hope that with this technology, the average tradesman can utilize the powerful visualizations AR has to offer, significantly improving their daily workflow.},
keywords = {Augmented Reality; Headset Design; Optical See-Through; Open Source},
pubstate = {published},
tppubtype = {conference}
}
Some of the biggest challenges in applying Augmented Reality (AR) technologies to the industry floor lie in the form factor and safety requirements of the head-worn display. These include alleviating issues such as peripheral-view occlusion and adaptation to personal protective equipment. In this work we present the design of Project Ariel, an Open Source, 3D-printable display specifically designed for use in industrial environments. It is our hope that with this technology, the average tradesman can utilize the powerful visualizations AR has to offer, significantly improving their daily workflow.
KEYWORDS
Augmented Reality; Headset Design; Optical See-Through; Open Source
Rompapas, Damien Constantine; Quiros, Daniel Flores; Rodda, Charlton; Brown, Bryan Christopher; Zerkin, Noah Benjamin; Cassinelli, Alvaro
Project Esky: an Open Source Software Framework for High Fidelity Extended Reality Conference
2021.
@conference{Rompapas2018c,
title = {Project Esky: an Open Source Software Framework for High Fidelity Extended Reality},
author = {Damien Constantine Rompapas and Daniel Flores Quiros and Charlton Rodda and Bryan Christopher Brown and Noah Benjamin Zerkin and Alvaro Cassinelli},
url = {https://beer-labs.net/wp-content/uploads/2021/12/Esky__ISMAR_Submission.pdf},
year = {2021},
date = {2021-05-08},
urldate = {2021-05-08},
abstract = {This demonstration showcases a complete Open-Source Augmented Reality (AR) modular platform capable of high fidelity natural hand interactions with virtual content, high field of view, and spatial mapping for environment interactions. We do this via several live desktop demonstrations. Finally, included in this demonstration is a completed open source schematic, allowing anyone interested in utilizing our proposed platform to engage with high fidelity AR. It is our hope that the work described in this demo will be a stepping stone towards bringing high-fidelity AR content to researchers and commodity users alike.},
keywords = {Augmented Reality, High Fidelity, Collaborative Augmented Reality, Open Source Platforms},
pubstate = {published},
tppubtype = {conference}
}
This demonstration showcases a complete Open-Source Augmented Reality (AR) modular platform capable of high fidelity natural hand interactions with virtual content, high field of view, and spatial mapping for environment interactions. We do this via several live desktop demonstrations. Finally, included in this demonstration is a completed open source schematic, allowing anyone interested in utilizing our proposed platform to engage with high fidelity AR. It is our hope that the work described in this demo will be a stepping stone towards bringing high-fidelity AR content to researchers and commodity users alike.
Keywords: Augmented Reality, High Fidelity, Collaborative Augmented Reality, Open Source Platforms
Rompapas, Damien; Sandor, Christian; Plopski, Alexander; Saakes, Daniel; Yun, Dong Hyeok; Taketomi, Takafumi; Kato, Hirokazu
HoloRoyale: A Large Scale High Fidelity Augmented Reality Game Conference
2018, ISBN: 978-1-4503-5949-8/18/10.
@conference{Rompapas2018,
title = {HoloRoyale: A Large Scale High Fidelity Augmented Reality Game},
author = {Damien Rompapas and Christian Sandor and Alexander Plopski and Daniel Saakes and Dong Hyeok Yun and Takafumi Taketomi and Hirokazu Kato},
url = {https://beer-labs.net/wp-content/uploads/2021/12/HoloRoyale___UIST.pdf},
doi = {10.1145/3266037.3271637},
isbn = {978-1-4503-5949-8/18/10},
year = {2018},
date = {2018-10-11},
urldate = {2018-10-11},
abstract = {INTRODUCTION
Recent years have seen an explosion in Augmented Reality (AR) experiences for consumers. These experiences can be classified based on the scale of the interactive area (room vs city/global scale), or the fidelity of the experience (high vs low) [4]. Experiences that target large areas, such as campus or world scale [7, 6], commonly have only rudimentary interactions with the physical world, and suffer from registration errors and jitter. We classify these experiences as large scale and low fidelity. On the other hand, various room sized experiences [5, 8] feature realistic interaction of virtual content with the real world. We classify these experiences as small scale and high fidelity.
Our work is the first to explore the domain of large scale high fidelity (LSHF) AR experiences. We build upon the small scale high fidelity capabilities of the Microsoft HoloLens to allow LSHF interactions. We demonstrate the capabilities of our system with a game specifically designed for LSHF interactions, handling many challenges and limitations unique to the domain of LSHF AR through the game design.
Our contributions are twofold:
1. The lessons learned during the design and development of a system capable of LSHF AR interactions.
2. The identification of a set of reusable game elements specific to LSHF AR, including mechanisms for addressing spatio-temporal inconsistencies and crowd control. We believe our contributions will be fully applicable not only to games, but to all LSHF AR experiences.},
keywords = {},
pubstate = {published},
tppubtype = {conference}
}
INTRODUCTION
Recent years have seen an explosion in Augmented Reality (AR) experiences for consumers. These experiences can be classified based on the scale of the interactive area (room vs city/global scale), or the fidelity of the experience (high vs low) [4]. Experiences that target large areas, such as campus or world scale [7, 6], commonly have only rudimentary interactions with the physical world, and suffer from registration errors and jitter. We classify these experiences as large scale and low fidelity. On the other hand, various room sized experiences [5, 8] feature realistic interaction of virtual content with the real world. We classify these experiences as small scale and high fidelity.
Our work is the first to explore the domain of large scale high fidelity (LSHF) AR experiences. We build upon the small scale high fidelity capabilities of the Microsoft HoloLens to allow LSHF interactions. We demonstrate the capabilities of our system with a game specifically designed for LSHF interactions, handling many challenges and limitations unique to the domain of LSHF AR through the game design.
Our contributions are twofold:
1. The lessons learned during the design and development of a system capable of LSHF AR interactions.
2. The identification of a set of reusable game elements specific to LSHF AR, including mechanisms for addressing spatio-temporal inconsistencies and crowd control. We believe our contributions will be fully applicable not only to games, but to all LSHF AR experiences.