Mixed Reality (MR) and Augmented Reality (AR) create exciting opportunities to engage users in immersive experiences, resulting in natural human-computer interaction.
Many MR interactions are built around a First-person Point of View (POV). In these cases, the user views the environment through a digital display, either a head-mounted display or a handheld computing device. One drawback of such conventional AR/MR platforms is that the experience is user-specific. Moreover, these platforms require the user to wear and/or hold an expensive device, which can be cumbersome and constrains the interaction techniques available.
We create a solution for multi-user interactions in AR/MR, where a group can share the same augmented environment with any computer-generated (CG) asset and interact in a shared story sequence through a third-person POV. Our approach is to instrument the environment, leaving the user unburdened of any equipment, creating
a seamless walk-up-and-play experience. We demonstrate this technology in a series of vignettes featuring humanoid animals.
Participants can not only see and hear these characters; they can also feel them on the bench through haptic feedback. Many of the characters also interact with users directly, either through speech or touch. In one vignette, an elephant hands a participant a glowing orb. This demonstrates HCI in its simplest form: a person walks
up to a computer, and the computer hands the person an object.
We create a 3D reconstruction of a scene using a combination of the depth and color sensors on an off-the-shelf Microsoft Kinect. To do this, we draw polygons using each point in the point cloud as a vertex, creating the appearance of a solid mesh. The mesh is then aligned to the RGB camera feed of the scene from the same Kinect. This alignment gives the mesh color, and completes a 3D reconstructed video feed.
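Because the Kinect's point cloud is laid out on the depth image's pixel grid, the triangulation step can be done implicitly: each 2×2 neighborhood of valid depth pixels yields two triangles whose vertices are the corresponding cloud points. The sketch below illustrates this idea; the function name, the zero-means-invalid convention, and the discontinuity threshold are our assumptions, not details from the original system.

```python
import numpy as np

def grid_to_triangles(depth, max_jump=0.1):
    """Build a triangle index list from an H x W depth image (meters).

    Each 2x2 block of valid pixels contributes two triangles, giving the
    sparse point cloud the appearance of a solid mesh. Triangles spanning
    a large depth discontinuity (> max_jump) are skipped so foreground
    and background are not stitched together.
    """
    h, w = depth.shape
    tris = []
    for y in range(h - 1):
        for x in range(w - 1):
            quad = depth[y:y + 2, x:x + 2]
            if (quad <= 0).any():                  # 0 = no depth reading
                continue
            if quad.max() - quad.min() > max_jump:  # depth discontinuity
                continue
            i = y * w + x                           # vertex index = flattened pixel index
            tris.append((i, i + 1, i + w))          # upper-left triangle
            tris.append((i + 1, i + w + 1, i + w))  # lower-right triangle
    return tris
```

Because vertex indices mirror pixel coordinates, texture coordinates for the color-alignment step fall out of the same grid for free.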
There are several problems that arise with the 3D reconstructed feed. First, the monocular feed creates “depth shadows” in areas where there is no direct line-of-sight to the depth sensor. Second, the depth camera is laterally offset from the RGB camera (since they cannot physically occupy the same space), so the two sensors have slightly different viewing angles, creating further depth shadowing. The resulting data feed is sparse and cannot represent the whole scene (see Figure 3). To solve this, we align the 3D depth feed with the 2D RGB feed from the Kinect. By compositing the depth feed over a 2D backdrop, the system effectively masks these depth shadows, creating a seamless composite that can then be populated with 3D CG assets.
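The compositing step amounts to a per-pixel fallthrough: wherever the reconstruction has no valid depth, the pixel comes from the aligned 2D camera image instead. A minimal sketch (function and argument names are hypothetical):

```python
import numpy as np

def composite_over_backdrop(render_rgb, render_valid, backdrop_rgb):
    """Fill 'depth shadows' in a rendered 3D feed with the 2D RGB backdrop.

    render_valid is a boolean H x W mask that is False wherever the depth
    sensor had no line of sight (a depth shadow). Those pixels fall through
    to the aligned 2D camera image, hiding the holes in the reconstruction.
    """
    mask = render_valid[..., None]           # broadcast over the color channel
    return np.where(mask, render_rgb, backdrop_rgb)
```

Since both feeds come from the same Kinect and are aligned, the seams between the 3D render and the 2D backdrop are not noticeable to the viewer.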
This mixed reality platform centers around the simple setting of a bench. The bench works in a novel way to constrain a few problems, such as identifying where a user is and subsequently inferring the direction of the user’s gaze (i.e., toward the screen). It creates a stage with a foreground and background, with the bench occupants in the middle ground. The bench also acts as a controller; the mixed reality experience does not trigger until at least one person is detected sitting on the bench. Further, different
seating formations on the bench trigger different experiences.
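One way to detect seating formations is to carve the depth image into one region per seat position and test whether enough pixels fall in the distance band a seated person would occupy. The sketch below follows that idea; the zone layout, distance band, and pixel threshold are illustrative assumptions, not values from the installed system.

```python
import numpy as np

def seating_formation(depth, seat_slices, person_range=(0.5, 2.0), min_pixels=200):
    """Infer which bench seats are occupied from a depth image (meters).

    seat_slices: one (row_slice, col_slice) region per seat position.
    A seat counts as occupied when at least min_pixels readings fall
    within the distance band a seated person would occupy. Returns a
    tuple of booleans, e.g. (True, False, True), which the experience
    manager can map to different content.
    """
    near, far = person_range
    occupied = []
    for rows, cols in seat_slices:
        region = depth[rows, cols]
        hits = np.count_nonzero((region > near) & (region < far))
        occupied.append(hits >= min_pixels)
    return tuple(occupied)
```

The returned tuple doubles as a dictionary key, so mapping formations to vignettes is a simple lookup.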
Magic Bench is a custom software and hardware platform, necessitating a solution to bridge both aspects. Between the two sits a series of patches created in Cycling ’74 Max that convert signals sent from the game engine (via OSC) about the positions and states of objects in the scene into the haptic sensations
felt on the bench. Haptic actuators are dynamically driven based on the location of animated content. The driving waveform for each actuator is designed according to the desired feel: in the current setup we can tweak base frequency, frequency of modulation, general amplitude, amplitude envelope, and three-dimensional position. These parameters can be manually tuned and/or adjusted in real time.
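The waveform parameters above can be combined into a single synthesis function: a base sine, amplitude-modulated by a slow envelope, scaled by the distance between the actuator and the animated content. The sketch below is our own illustration of such a design (the function, default values, and inverse-distance falloff are assumptions, not the actual Max patch logic).

```python
import numpy as np

def haptic_waveform(t, base_hz=160.0, mod_hz=4.0, amp=1.0,
                    actuator_pos=(0.0, 0.0, 0.0),
                    content_pos=(0.0, 0.0, 0.0), falloff=1.0):
    """Synthesize one actuator's drive signal for sample times t (seconds).

    A base sine at base_hz is amplitude-modulated at mod_hz, then scaled
    by the distance between the actuator and the animated content, so a
    character 'walking' along the bench sweeps across the actuators.
    """
    d = np.linalg.norm(np.asarray(content_pos) - np.asarray(actuator_pos))
    gain = amp / (1.0 + falloff * d)                         # inverse-distance falloff
    envelope = 0.5 * (1.0 + np.sin(2 * np.pi * mod_hz * t))  # 0..1 tremolo
    return gain * envelope * np.sin(2 * np.pi * base_hz * t)
```

In the actual system, the game engine only streams positions and states over OSC; synthesis like this happens on the Max side, where each parameter remains tunable live.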
2 INSTALLATION OPTIONS
This piece can run as a traditional VR Village installation or as an autonomous piece in an unsuspecting area at SIGGRAPH: imagine sitting on a bench to rest your feet or check your email; in front of you is a screen showing a SIGGRAPH showreel. Once the system detects you, the content switches to a video feed of you,
creating a mirror effect. From there, an unexpected AR experience unfolds.