Networked, real time translation of 3D mesh data to immersive virtual reality environments

Kevin Lesniak, Janis P. Terpenny, Conrad S. Tucker, Chimay Anumba, Sven G. Bilen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Immersive virtual reality systems have the potential to transform the manner in which designers create prototypes and collaborate in teams. Using technologies such as the Oculus Rift or the HTC Vive, a designer can attain a sense of "presence" and "immersion" typically not experienced by traditional CAD-based platforms. However, one of the fundamental challenges of creating a high-quality immersive virtual reality experience is actually creating the immersive virtual reality environment itself. Typically, designers spend a considerable amount of time manually designing virtual models that replicate physical, real-world artifacts. While there exists the ability to import standard 3D models into these immersive virtual reality environments, these models are typically generic in nature and do not represent the designer's intent. To mitigate these challenges, the authors of this work propose the real-time translation of physical objects into an immersive virtual reality environment using readily available RGB-D sensing systems and standard networking connections. The emergence of commercial, off-the-shelf RGB-D sensing systems, such as the Microsoft Kinect, has enabled the rapid 3D reconstruction of physical environments. The authors present a methodology that employs 3D mesh reconstruction algorithms and real-time rendering techniques to capture physical objects in the real world and represent their 3D reconstruction in an immersive virtual reality environment with which the user can then interact. A case study involving a commodity RGB-D sensor and multiple computers connected through standard TCP internet connections is presented to demonstrate the viability of the proposed methodology.
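The abstract describes streaming reconstructed 3D mesh data between machines over standard TCP connections. As an illustrative sketch only (not the authors' implementation — the wire format, function names, and length-prefixed framing are assumptions for illustration), such a pipeline's networking layer might look like this:

```python
# Hypothetical sketch: sending a triangle mesh over TCP as one
# length-prefixed binary message, in the spirit of the paper's
# networked mesh-translation pipeline.
import socket
import struct

def serialize_mesh(vertices, faces):
    """Pack a mesh as: vertex count, face count, xyz floats, index triples."""
    header = struct.pack("<II", len(vertices), len(faces))
    vert_bytes = b"".join(struct.pack("<fff", *v) for v in vertices)
    face_bytes = b"".join(struct.pack("<III", *f) for f in faces)
    payload = header + vert_bytes + face_bytes
    # A length prefix lets the receiver read one whole mesh per message.
    return struct.pack("<I", len(payload)) + payload

def deserialize_mesh(payload):
    """Inverse of serialize_mesh, applied to the bytes after the prefix."""
    n_v, n_f = struct.unpack_from("<II", payload, 0)
    off = 8
    vertices = [struct.unpack_from("<fff", payload, off + 12 * i)
                for i in range(n_v)]
    off += 12 * n_v
    faces = [struct.unpack_from("<III", payload, off + 12 * i)
             for i in range(n_f)]
    return vertices, faces

def recv_exact(sock, n):
    """Read exactly n bytes; TCP is a byte stream with no message framing."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        buf += chunk
    return buf

def recv_mesh(sock):
    (length,) = struct.unpack("<I", recv_exact(sock, 4))
    return deserialize_mesh(recv_exact(sock, length))

# Loopback demo: one triangle sent from a "capture" side to a "render" side.
sender, receiver = socket.socketpair()
tri_verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
tri_faces = [(0, 1, 2)]
sender.sendall(serialize_mesh(tri_verts, tri_faces))
got_verts, got_faces = recv_mesh(receiver)
sender.close()
receiver.close()
```

In a real capture-to-VR loop, the sender side would be fed by the RGB-D reconstruction step and the receiver side would upload the decoded vertices and faces into the rendering engine each frame or on change.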

Original language: English (US)
Title of host publication: 36th Computers and Information in Engineering Conference
Publisher: American Society of Mechanical Engineers (ASME)
Volume: 1B-2016
ISBN (Electronic): 9780791850084
DOIs: https://doi.org/10.1115/DETC2016-59762
State: Published - Jan 1 2016
Event: ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, IDETC/CIE 2016 - Charlotte, United States
Duration: Aug 21 2016 - Aug 24 2016

Other

Other: ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, IDETC/CIE 2016
Country: United States
City: Charlotte
Period: 8/21/16 - 8/24/16

All Science Journal Classification (ASJC) codes

  • Mechanical Engineering
  • Computer Graphics and Computer-Aided Design
  • Computer Science Applications
  • Modeling and Simulation

Cite this

Lesniak, K., Terpenny, J. P., Tucker, C. S., Anumba, C., & Bilen, S. G. (2016). Networked, real time translation of 3D mesh data to immersive virtual reality environments. In 36th Computers and Information in Engineering Conference (Vol. 1B-2016). American Society of Mechanical Engineers (ASME). https://doi.org/10.1115/DETC2016-59762