Design Prototype and Inside-out Tracking – LPVIZ (Part 2)

LPVIZ Prototype Industrial Design

This post is a follow-up to the introduction of our augmented reality (AR) headset LPVIZ. See our previous post here.

For the past two months the LPVIZ team has been working hard to improve our initial prototype. We have enhanced the device’s appearance and optimized it ergonomically. My colleague Seeon Mitchel has made draft 3D prints of the design he has been planning for the initial release of LPVIZ. The results are looking excellent (Figures 1 & 2).

The ring design for fixing the unit to the user’s head feels comfortable. Even during longer sessions the unit does not cause neck fatigue. Below are two photos of the current functional prototype with the newly printed shell.

Figures 1 & 2 – The fully functional LPVIZ design prototype

Inside-out Tracking and Gesture Recognition

The latest LPVIZ prototype features a built-in stereo camera. We are using the excellent Rigel module by UltraLeap, which allows us to run a SLAM (simultaneous localization and mapping) algorithm and UltraLeap’s hand tracking at the same time.

Using the Rigel’s stereo camera, my colleague Thomas Hauth has developed a state-of-the-art inside-out tracking algorithm that allows the headset to be used inside a vehicle, even if no additional cameras are installed. The video below (Figure 3) shows the fundamental functionality of the algorithm.

Figure 3 – The video shows the fundamental functionality of the LPSLAM inside-out tracking algorithm

It is important to note that this will not be a full replacement for ART outside-in tracking inside the vehicle: ART’s tracking engine is more accurate and more robust under difficult lighting conditions. Still, we also want to serve customers who have a smaller budget or no possibility to install additional equipment inside their vehicle.

Thomas wearing LPVIZ

AR HMD for In-Car Applications – LPVIZ (Part 1)

What is In-Vehicle AR?

This article describes our first steps in the development of an AR HMD for in-car, aerospace and naval applications.

Over several years we have developed our LPVR middleware. In the first version the purpose of this middleware was to enable location-based VR with a combination of optical and IMU-based headset tracking. Building on this foundation we extended the system to work as a tracking solution for transportation platforms such as cars, ships or airplanes (Figure 1).

In contrast to stationary applications, where an IMU is sufficient to track the rotations of an HMD, the in-vehicle use-case requires an additional IMU fixed to the vehicle, and the information from this sensor needs to become part of the sensor fusion. We realized this with our LPVR-DUO tracking system.
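To illustrate the underlying idea, here is a minimal Python sketch of how the orientation reported by a head-mounted IMU can be expressed relative to a vehicle-mounted IMU. The quaternion helpers and variable names are illustrative assumptions made for this post, not the actual LPVR-DUO implementation.

```python
import numpy as np

def quat_conjugate(q):
    # Quaternion as [w, x, y, z]; the conjugate inverts a unit quaternion.
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_multiply(a, b):
    # Hamilton product of two quaternions [w, x, y, z].
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def head_in_vehicle_frame(q_head_world, q_vehicle_world):
    """Remove the vehicle's rotation from the headset's rotation.

    Both inputs are world-referenced orientation estimates from the two
    IMUs; the result is the head orientation relative to the cabin, so
    that the car's own turns do not rotate the virtual content.
    """
    return quat_multiply(quat_conjugate(q_vehicle_world), q_head_world)
```

In practice the fusion also has to deal with gyroscope bias, accelerometer disturbances caused by vehicle acceleration and the optical position measurements, which is where the actual LPVR-DUO filter does considerably more work than this sketch.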

Applying this middleware to existing augmented reality headsets on the market turned out to be challenging. Most AR HMDs use their own proprietary tracking technology that is only suitable for stationary use-cases, but doesn’t work in moving vehicles. Accessing such a tracking pipeline in order to extend it with our sensor fusion is usually not possible.


Figure 1 – Principle of in-car AR/VR as implemented with LPVR-DUO

Applications

There is a wide range of applications for in-car augmented reality, ranging from B2B use-cases in design and development to consumer-facing scenarios. A few are listed in the illustration below (Figure 2).


Figure 2 – In-car AR use cases range from a simple virtual dashboard to interactive e-commerce applications. The “camera pass-through” enables the driver to virtually look through the car to see objects otherwise occluded by the car chassis.

HMD Specifications

For this reason, we decided to start the development of LPVIZ, an AR HMD dedicated to in-vehicle applications. This AR HMD for in-car, aerospace and naval use is designed to reflect the requirements of our customers as closely as possible:

  • Strong optical engine with good FOV (LUMUS waveguides), unobstructed lateral vision (safety), low persistence and high refresh rate
  • System satisfies all requirements for immersive AR head tracking (pose prediction, head motion model, late latching, asynchronous timewarp etc.)
  • HMD is tethered to the computing unit in the vehicle by a thin VirtualLink cable
  • Computing unit is compact, but powerful enough to run SteamVR and thus supports a large range of software applications
  • Options to use either outside-in or inside-out optical tracking inside the vehicle, as well as LeapMotion hand tracking

In-Car HMD Hardware Prototype Development

We have recently created the first prototype of LPVIZ. Hardware development is still at a very early stage, but it is already enough to demonstrate our core functionality and use-case well.


Figure 3 – Tracking of LPVIZ works based on our LPVR-DUO technology making use of ART outside-in tracking and our LPMS-CURS2 IMU module. This image shows Dr. Thomas Hauth performing an optical-see-through (OST) calibration.

Figure 4 – The LPVIZ prototype is powered by a LUMUS optical engine. This waveguide-based technology has excellent optical characteristics, perfectly suitable for our use-case.

Work in Progress

As you can see from the prototype images, our hardware system is still very much at an alpha stage. Nevertheless, we think it shows the capabilities of our technology very well and points in the right direction. In the next hardware version, which will already be close to a release model, we will reduce the size of the device by applying the points below:

  • Use active marker LEDs instead of large passive marker balls, or switch to inside-out tracking
  • Collect all electronics components on one compact electronics board, with only one VirtualLink connector
  • Create a compact housing, with a glasses-like fixture instead of a VR-style ring mount (Figure 5)

Figure 5 – First draft of a CAD design for the housing of the LPVIZ release version

LPVR Middleware – a Full Solution for AR / VR

Introducing LPVR Middleware

Building on the technology we developed for our IMU sensors and large-scale VR tracking systems, we have created a full motion tracking and rendering pipeline for virtual reality (VR) and augmented reality (AR) applications.

The LPVR middleware is a full solution for AR / VR that enables headset manufacturers to easily create a state-of-the-art visualization pipeline customized to their product. Specifically our solution offers the following features:

    • Flexible zero-latency tracking adaptable to any combination of IMU and optical tracking
    • Rendering pipeline with motion prediction, late latching and asynchronous timewarp functionality (see the prediction sketch below)
    • Calibration algorithms for optical parameters (lens distortion, optical see-through calibration)
    • Full integration in commonly used driver frameworks like OpenVR and OpenXR
    • Specific algorithms and tools to enable VR / AR in vehicles (car, plane etc.) or motion simulators

Overview of LPVR Middleware Functionality
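To give an idea of what the motion prediction mentioned above does, the following minimal Python sketch extrapolates a head orientation forward by the expected motion-to-photon latency using a constant angular velocity model. Function and variable names are hypothetical; this illustrates the principle and is not LPVR code.

```python
import numpy as np

def predict_orientation(q, gyro_rad_s, latency_s):
    """Extrapolate a unit quaternion [w, x, y, z] forward in time.

    gyro_rad_s is the body-frame angular velocity from the IMU (rad/s),
    latency_s is the expected motion-to-photon latency. A constant
    angular velocity over the prediction horizon is assumed.
    """
    rate = np.linalg.norm(gyro_rad_s)
    theta = rate * latency_s          # total rotation angle over the horizon
    if theta < 1e-9:
        return q
    axis = gyro_rad_s / rate
    # Incremental rotation expressed as a quaternion.
    dq = np.concatenate(([np.cos(theta / 2.0)], np.sin(theta / 2.0) * axis))
    # Apply the body-frame increment on the right: q_predicted = q * dq.
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = dq
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])
```

With latencies on the order of 10–20 ms, even such a first-order prediction noticeably reduces perceived lag; late latching and asynchronous timewarp then correct for whatever error remains at display time.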

Application of LPVR Middleware to In-Car VR / AR

The tracking backend of the LPVR middleware is especially advanced in that it allows the flexible combination of multiple optical systems and inertial measurement units (IMUs) for combined position and orientation tracking. Specifically, it enables the de-coupling of the head motion of a user from the motion of the vehicle the user might be riding in, such as a car or airplane.

As shown in the illustration below, this way the interior of the vehicle can be displayed as static relative to the user, while the scenery outside the vehicle moves with the vehicle’s motion.

Illustration of In-car VR Installation
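Conceptually, this amounts to anchoring in-cabin content to the vehicle frame and outside content to the world frame before rendering. The short Python sketch below, using hypothetical 4x4 transform names, shows the two composition chains; it illustrates the principle rather than the actual LPVR renderer.

```python
import numpy as np

def view_matrices(T_world_vehicle, T_vehicle_head):
    """Compose view transforms for the two kinds of content.

    T_world_vehicle: 4x4 pose of the vehicle in the world (e.g. from the
                     vehicle-referenced part of the sensor fusion).
    T_vehicle_head:  4x4 pose of the head in the cabin (the decoupled
                     head tracking result).
    Interior content is authored in the vehicle frame, so only the head
    pose affects its view; exterior content is authored in the world
    frame, so the vehicle's own motion is applied as well.
    """
    T_world_head = T_world_vehicle @ T_vehicle_head
    view_interior = np.linalg.inv(T_vehicle_head)  # cabin appears static while the car turns
    view_exterior = np.linalg.inv(T_world_head)    # outside scenery moves with the car
    return view_interior, view_exterior
```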

For any augmented or virtual reality application in a moving vehicle, this functionality is essential to provide an immersive experience to the user. LP-Research is the industry leader in providing customized sensor fusion solutions for augmented and virtual reality.

If you are interested in this solution, please contact us to start discussing your application case.
