Real-time occlusion between real and digital objects in augmented reality

Kevin Lesniak, Conrad S. Tucker

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The method presented in this work reduces the frequency of virtual objects incorrectly occluding real-world objects in Augmented Reality (AR) applications. Current AR rendering methods cannot properly represent occlusion between real and virtual objects because the objects are not represented in a common coordinate system. These occlusion errors can give users an incorrect perception of the environment around them when using an AR application: a real-world object may go unnoticed because a virtual object incorrectly occludes it, and depth or distance may be misjudged because of the incorrect occlusions. The authors of this paper present a method that brings both real-world and virtual objects into a common coordinate system so that distant virtual objects do not obscure nearby real-world objects in an AR application. The method captures and processes RGB-D data in real time, allowing it to be used in a variety of environments and scenarios. A case study demonstrates the effectiveness and usability of the proposed method in correctly occluding real-world and virtual objects and providing a more realistic representation of the combined real and virtual environments in an AR application. The results of the case study show that the proposed method can detect at least 20 real-world objects at risk of being incorrectly occluded while processing and correcting occlusion errors at least five times per second.
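
The paper's implementation is not reproduced in this record, so the following is a minimal illustrative sketch, not the authors' code, of the general idea the abstract describes: once real-world depth from an RGB-D sensor and the virtual render's depth buffer are expressed in the same camera coordinate system, a per-pixel depth comparison determines which virtual pixels lie behind real surfaces and should be suppressed. The function name, array layout, and conventions below are assumptions made for illustration.

```python
import numpy as np

def composite_with_occlusion(rgb_real, depth_real, rgb_virtual, depth_virtual):
    """Hypothetical sketch of depth-based occlusion handling in AR.

    Assumes both depth maps are expressed in the same camera coordinate
    system and the same units (e.g., meters), with 0 or NaN marking pixels
    where the sensor returned no depth and +inf marking pixels where no
    virtual object was drawn.

    rgb_real:      (H, W, 3) camera image of the real scene
    depth_real:    (H, W)    depth map from the RGB-D sensor
    rgb_virtual:   (H, W, 3) rendered virtual objects
    depth_virtual: (H, W)    depth buffer of the virtual render
    """
    # A virtual pixel is kept only where a virtual object was drawn and
    # no valid real-world surface lies in front of it.
    real_valid = np.isfinite(depth_real) & (depth_real > 0)
    virtual_drawn = np.isfinite(depth_virtual)

    virtual_in_front = virtual_drawn & (
        ~real_valid | (depth_virtual < depth_real)
    )

    # Composite: show the virtual pixel where it wins the depth test,
    # otherwise keep the real camera pixel (correct occlusion).
    mask = virtual_in_front[..., None]
    return np.where(mask, rgb_virtual, rgb_real)
```

In a full AR pipeline the sensor depth map would first be registered to the color camera and the virtual depth buffer read back from the renderer each frame; the contribution claimed in the abstract is performing this alignment and correction fast enough for interactive use, at least five updates per second.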

Original language: English (US)
Title of host publication: 38th Computers and Information in Engineering Conference
Publisher: American Society of Mechanical Engineers (ASME)
ISBN (Electronic): 9780791851739
DOIs: 10.1115/DETC2018-86346
State: Published - Jan 1 2018
Event: ASME 2018 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, IDETC/CIE 2018 - Quebec City, Canada
Duration: Aug 26 2018 - Aug 29 2018

Publication series

Name: Proceedings of the ASME Design Engineering Technical Conference
Volume: 1B-2018

Other

Other: ASME 2018 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, IDETC/CIE 2018
Country: Canada
City: Quebec City
Period: 8/26/18 - 8/29/18

Fingerprint

  • Augmented reality
  • Occlusion
  • Real-time
  • Virtual reality
  • Object
  • Processing
  • Virtual Environments
  • Rendering
  • Usability

All Science Journal Classification (ASJC) codes

  • Mechanical Engineering
  • Computer Graphics and Computer-Aided Design
  • Computer Science Applications
  • Modeling and Simulation

Cite this

Lesniak, K., & Tucker, C. S. (2018). Real-time occlusion between real and digital objects in augmented reality. In 38th Computers and Information in Engineering Conference (Proceedings of the ASME Design Engineering Technical Conference; Vol. 1B-2018). American Society of Mechanical Engineers (ASME). https://doi.org/10.1115/DETC2018-86346