About Tobias Schlüter

Before coming to LP-Research, I did particle physics research. My publications can be found here. At LP-Research, I am responsible for our LPVR series of products. My title is Chief Scientist.

LPVR-DUO in an Airborne Helicopter

In-Flight VR

Imagine soaring through the skies as a pilot, testing the limits of a helicopter’s capabilities while feeling the rush of wind and turbulence. Now imagine that instead of the real world outside and the safe landing pad your helicopter is approaching, you see a virtual reality (VR) scene in which you are homing in on a ship in high seas. The National Research Council Canada (NRC) and Defence Research and Development Canada (DRDC) have brought this experience to life with their groundbreaking Integrated Reality In-Flight Simulation (IRIS).

IRIS is not your ordinary simulator; for one thing, it’s not sitting on a hexapod, it’s airborne. It’s a variable-stability helicopter based on the Bell 412 that can behave like other aircraft and can simulate varying weather conditions; combine that with a VR environment and you have a tool that allows safe training for operations in the most adverse conditions. In particular, it is used for Ship Helicopter Operating Limitations (SHOL) testing.

Mission-Critical Application with LPVR-DUO

The LPVR-DUO system is what makes VR possible on this constantly moving platform. This cutting-edge AR/VR tracking system seamlessly merges the inertial measurements taken by the headset with the helicopter’s motion data and with a camera system mounted inside the cabin to present the correct visuals to the pilot. The challenge of using cameras to track the VR headset in the tight environment of the helicopter, under ever-changing lighting conditions, is overcome by using an ART SmartTrack 3 system. This system follows an arrangement of reflective markers attached to the pilot’s helmet. The VR headset is attached to the helmet in such a way that the pilot can wear it as if it were a pair of night vision goggles. Put together, this allows displaying a virtual world to the pilot, even during the most extreme maneuvers.
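
To give an idea of the principle at work here, the sketch below shows the essential quaternion arithmetic for tracking a headset on a moving vehicle: subtracting the vehicle's own rotation from the headset's world-frame orientation yields the head orientation relative to the cabin, which is also the quantity the cockpit-mounted optical tracker measures. This is only an illustration of the idea, not the actual LPVR-DUO algorithm.

    # Minimal sketch of the platform-compensation idea behind tracking a headset
    # on a moving vehicle; an illustration of the principle only, not the actual
    # LPVR-DUO sensor fusion. Orientations are unit quaternions (w, x, y, z)
    # expressed in a common world frame.
    import numpy as np

    def quat_conjugate(q):
        w, x, y, z = q
        return np.array([w, -x, -y, -z])

    def quat_multiply(a, b):
        aw, ax, ay, az = a
        bw, bx, by, bz = b
        return np.array([
            aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw,
        ])

    def head_in_cabin(q_head_world, q_cabin_world):
        # Removing the cabin's own rotation leaves the head orientation relative
        # to the cockpit, which is what the in-cabin optical tracker sees.
        return quat_multiply(quat_conjugate(q_cabin_world), q_head_world)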

To ensure an authentic experience, the IRIS system incorporates real-time turbulence models, meticulously crafted from wind tunnel trials. These turbulence effects are seamlessly integrated into the aircraft’s motion and into the VR scene, providing pilots with precise proprioceptive and vestibular cues. It’s a symphony of technology and innovation in the world of aviation testing.

In-Cockpit Implementation

The optical tracking system relies on highly reflective marker targets on the helmet to track movement in three dimensions. Initially, only five markers were installed, strategically placed for optimal tracking. But the pursuit of perfection led NRC to create custom 3D-printed low-reflectivity helmet molds, allowing them to mount a dozen small passive markers. This significantly improved tracking reliability in various lighting conditions and allowed for a wider range of head movement.

Recently, NRC put this remarkable concept to the test with actual flight trials. The response from pilots was nothing short of exhilarating. They found the system required minimal adaptation, exhibited no noticeable lag, and, perhaps most impressively, didn’t induce any motion sickness. Even the turbulence effects felt incredibly realistic. Surprisingly, the typical VR drawbacks, such as resolution and field of view limitations, had minimal impact, especially during close-in shipboard operations. It’s safe to say that IRIS has set a new standard for effective and immersive aviation testing.

Publication of Results

The NRC team presented their results at the Vertical Flight Society’s 79th Annual Forum in two papers [1] and [2], and they also have a blog post on their site.

NOTE: Image contents courtesy of Aerospace Research Centre, National Research Council of Canada (NRC) – Ottawa, ON, Canada

New Features in LPVR Version 4.8

Introduction

Our LPVR series is the primary solution on the market for users who want to expand the scope of their virtual reality or mixed reality headsets by using external tracking systems such as ART, OptiTrack or Vicon. Use cases are varied, ranging from entertainment (location-based VR) and engineering (ergonomic studies in AR) to helicopters and virtual cars actually driving on Japan’s public roads. At LP-Research, we have continuously developed the LPVR series of solutions over the past years, expanding its scope, adding support for new headsets, and including new functions.

The image below shows an LPVR installation based on design content created by automotive prototyping company Phiaro Inc. in Tokyo, Japan.

The latest release is version 4.8.0, which came out in June 2023. As usual, it comes in two flavors:

  • LPVR-CAD which supports stationary use-cases, and
  • LPVR-DUO which is our variant for moving platforms, be they cars or simulators.

We support all the major tethered headsets (SteamVR headsets, Pimax, Varjo).  We also support Meta Quest series headsets and the Vive Focus 3 with our LPVR-Air series of products. If you have a current support contract, you are eligible for an update.

A Brief Overview of LPVR-CAD and LPVR-DUO

It is perhaps best to summarize some of the capabilities that our products add to the various commercial headsets. For more details, feel free to visit the product pages for LPVR-CAD and LPVR-DUO, respectively:

  • Cover arbitrarily large areas and have VR scenes take place in them
  • Have an arbitrary number of users interact in such a space
  • Do VR/AR inside a car or any other moving platform
  • Track your user with sub-millimeter precision, together with any number of props, with no perceptible latency
  • Use SteamVR controllers without the Lighthouses

We can do this because our proprietary sensor fusion algorithms allow us to combine the measurements of high-precision motion tracking camera systems with the measurements of the headset’s Inertial Measurement Unit (IMU). For moving platforms, we can additionally incorporate data from an IMU installed on the platform to deliver responsive, accurate performance in those circumstances as well.
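
As a rough illustration of what such a fusion involves (and only that; LPVR's actual algorithms are proprietary and considerably more sophisticated), the sketch below integrates the headset's gyroscope at a high rate and nudges the result toward the optical measurement whenever a camera frame arrives. The optical system removes the slow gyro drift, while the IMU keeps the estimate responsive between camera frames.

    # Simplified complementary filter combining high-rate gyro data with low-rate
    # optical orientation measurements; an illustrative stand-in for the idea, not
    # LP-Research's proprietary sensor fusion. Quaternions are (w, x, y, z).
    import numpy as np

    def integrate_gyro(q, gyro_rad_s, dt):
        # Quaternion kinematics: q_dot = 0.5 * q (x) (0, omega)
        qw, qx, qy, qz = q
        wx, wy, wz = gyro_rad_s
        dq = 0.5 * np.array([
            -qx * wx - qy * wy - qz * wz,
             qw * wx + qy * wz - qz * wy,
             qw * wy - qx * wz + qz * wx,
             qw * wz + qx * wy - qy * wx,
        ])
        q = q + dq * dt
        return q / np.linalg.norm(q)

    def correct_with_optical(q_imu, q_optical, alpha=0.02):
        # Blend a small fraction toward the optical measurement each camera frame;
        # the sign flip keeps the interpolation on the shorter arc.
        if np.dot(q_imu, q_optical) < 0.0:
            q_optical = -q_optical
        q = (1.0 - alpha) * q_imu + alpha * q_optical
        return q / np.linalg.norm(q)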

New Features

For a short overview of the changes in each version, please refer to our Release Notes. Here we will give some highlights and dig into some details. LPVR 4.8.0 is the result of continuous development in the half year or so since our previous releases.

New GUI Organization and Visual LPVR-DUO Configuration Interface

The most obvious change for users will be the reorganized GUI, which streamlines the setup, completely does away with the need to enter any JSON code, and presents a more cleanly organized interface. Especially for our LPVR-DUO users this means a vast simplification of the system. We have kept the old configuration interface as an option to guarantee compatibility with existing workflows, but we don’t expect users will have to resort to it. Please let us know if your experience is different. If your headset tracking body is already calibrated, you should now be able to set up LPVR-DUO with about five mouse clicks.

When you load up the configuration, it will look something like this. Note that you are no longer taken to a JSON editor where you have to enter the configuration manually; instead you are greeted by a friendly, informative GUI.

At the bottom of the page, you will see links to the Documentation, a Calibration screen, and an Expert Mode, which is essentially the old JSON editor. The Calibration screen is used for setting up the Platform IMU and, in the usual case, simplifies it down to a few mouse clicks. No more looking for quaternion values in log files! Please check out the corresponding documentation.

Varjo Headset Eye Point Adjustments

Together with Varjo and in cooperation with several of our customers, we were able to identify and correct some imprecisions in the handling of the headset’s position. These showed up as small coordinate mismatches between the optical tracking coordinates and the coordinates reported to VRED, Unity, etc. They also led to some unnatural motion of AR overlays, especially when turning the head.

Optimal performance requires updating both Varjo Base to at least version 3.10 and LPVR to at least version 4.8.0. Updating Varjo Base fixes the underlying issue; updating LPVR corrects the interfacing. If you cannot update Varjo Base, you can still update LPVR-CAD-Varjo to version 4.8.0 and enable a workaround. To do so, open the Varjo Base configuration GUI on the System tab, add patchPositionBug=true in the field labeled Additional Settings, and click the “Submit” button. Note that while this works around the issue in Varjo Base versions before 3.10, it is not recommended to use this option with updated versions of Varjo Base.
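
For reference, with Varjo Base older than 3.10 the Additional Settings field then contains just this single line:

    patchPositionBug=true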

Varjo Configuration Refinements

Different environments call for different setups. Some of our users work with administrator accounts; others have multiple users who should all share the same configuration. We have updated the way we organize on-disk storage of the configuration to address these possibilities. In particular, you can now establish a system-wide configuration default and override it per-user. In the case of LPVR-CAD, the configuration is entered inside Varjo Base by default, but to give users greater flexibility it has always been possible to perform the configuration through our web interface or through files on disk. While these are not the preferred choice, it was previously not possible to see from Varjo Base whether an on-disk configuration is in use. We have therefore added a prominent status indicator that points to the configuration in use, as in the screenshot below. In the case of LPVR-DUO the configuration is always loaded from disk, as the added flexibility of our configuration page is required; in LPVR-CAD the user has to opt in. We describe the process briefly below.

The user can prepare a global, system-wide default configuration in %ProgramData%/Varjo/VarjoTracking/Plugins/LP-Research/LPVR-CAD-Varjo/configuration/settings.json. Changes made on the configuration page will not modify this file; they are instead written to the per-user configuration %LocalAppData%/LP-Research/LPVR-CAD-Varjo/settings.json. If either file is present, the configuration inside Varjo Base is ignored and the user can configure LPVR-CAD through their web browser. In this way, an administrator can prepare a configuration that works with the setup, and any user can customize it to their needs.

For LPVR-DUO, there is no configuration interface inside Varjo Base; instead the user always points their web browser to http://localhost:7119. Here, a system-wide default configuration can be placed in %ProgramData%/Varjo/VarjoTracking/Plugins/LP-Research/LPVR-DUO-Varjo/configuration/settings.json, and a per-user override can sit in %LocalAppData%/LP-Research/LPVR-DUO-Varjo/settings.json. The web interface always updates the per-user file.
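
To make the precedence explicit, here is a small sketch of the lookup order for LPVR-CAD-Varjo as described above; it illustrates the rules rather than reproducing the plugin's actual code.

    # Illustrative lookup order for the LPVR-CAD-Varjo configuration described
    # above; a sketch of the precedence rules, not the plugin's implementation.
    import os

    def find_lpvr_cad_settings():
        per_user = os.path.expandvars(
            r"%LocalAppData%\LP-Research\LPVR-CAD-Varjo\settings.json")
        system_wide = os.path.expandvars(
            r"%ProgramData%\Varjo\VarjoTracking\Plugins\LP-Research"
            r"\LPVR-CAD-Varjo\configuration\settings.json")

        # The per-user file overrides the system-wide default; if either exists,
        # the configuration entered inside Varjo Base is ignored.
        for path in (per_user, system_wide):
            if os.path.exists(path):
                return path
        return None  # fall back to the configuration stored in Varjo Base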

LPVR-DUO Demonstration

In order to familiarize you with the neighborhood of our office and, more importantly, to show what can be done with LPVR-DUO, here is an in-car mixed reality demonstration. The video screens on the glove box may look almost real, but they are an overlay superimposed on the see-through camera image of a Varjo XR-3 using an out-of-the-box LPVR-DUO setup. Notice how the screens remain firmly in place during turns of the user’s head as well as turns of the car itself, even when diving into some of the steeper roads of the Motoazabu area in central Tokyo.

Large-scale VR Application Case: the Holodeck Control Center

The AUDI Holodeck

LPVR interaction

Our large-scale VR solution allows any SteamVR-based (e.g. Unity, Unreal, VRED) Virtual Reality software to seamlessly use the HTC VIVE headset together with most large-room tracking systems available on the market (OptiTrack, Vicon, ART). It enables easy configuration and fits into the SteamVR framework, minimizing the effort needed to port applications to large rooms.

One of our first users, Lightshape, has recently released a video showing what they built with our technology. They call it the Holodeck Control Center, an application that creates multi-user collaborative VR spaces in which users can communicate and see the same scene whether they are in the same real room or in different locations. The installation showcased in the video is used by German car maker Audi to study cars that haven’t been built yet.

Our technology is essential for getting the best possible VR experience on the 15 m × 15 m main VR surface, combining optical tracking data and IMU measurements to provide precise and responsive positioning of the headsets. Please have a look at Lightshape’s video below.

Ready for the HTC Vive Pro

In the near future, this installation will be updated to the HTC Vive Pro, which our software already supports. The increased pixel density of this successor to the HTC Vive will make the scenes look even more realistic. The resolution is high enough to actually read the various panels once you are in the driver’s seat! Besides that, we are also busy studying applications of the front-facing cameras of the Vive Pro in order to improve multi-user interaction.

Location-based VR Tracking for All SteamVR Applications

LPVR Pipeline Overview

UPDATE 1 – LPVR now offers VIVE Pro support!

UPDATE 2 – LPVR can now talk to all optical tracking systems that support VRPN (VICON, ART etc.).

UPDATE 3 – Of special interest to automotive customers may be that this also supports Autodesk VRED.

Current VR products cannot serve several important markets due to limitations of their tracking systems. Out of the box, both the HTC Vive and the Oculus Rift are limited to tracking areas smaller than 5 m × 5 m, which is too small for most multi-player applications. We have previously presented our solution that combines our motion sensing IMU technology, OptiTrack camera-based tracking and the HTC Vive to allow responsive multiplayer VR experiences over larger areas. Because of the necessary interfacing, however, applications still needed to be prepared specifically for that solution.

We have now further improved our software stack so that we can provide a SteamVR driver for our solution. On the one hand, this means that any existing SteamVR application automatically supports the arbitrarily large tracking areas covered by the OptiTrack system. On the other hand, it means that no additional plugins for Unity, Unreal or your development platform of choice are needed; support is automatic. Responsive behavior is guaranteed by using LP-RESEARCH’s IMU technology in combination with standard low-latency VR techniques such as asynchronous time warping, late latching and so on. An overview of the functionality of the system is shown in the image above.
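
One of the low-latency building blocks mentioned above can be illustrated in a few lines: predicting the head orientation a few milliseconds ahead from the IMU's angular rate, so that the rendered frame matches where the head will be when the image actually reaches the eye. Again, this is a sketch of the general technique, not the LPVR driver's implementation.

    # Sketch of IMU-based forward prediction, a standard low-latency technique;
    # illustrative only, not the actual LPVR driver code. Quaternions (w, x, y, z).
    import numpy as np

    def predict_orientation(q, gyro_rad_s, lookahead_s=0.015):
        # Assume the angular rate stays constant over the look-ahead interval and
        # rotate the current orientation forward by that amount (body frame).
        rate = np.linalg.norm(gyro_rad_s)
        angle = rate * lookahead_s
        if angle < 1e-9:
            return q
        axis = gyro_rad_s / rate
        half = 0.5 * angle
        dw = np.cos(half)
        dx, dy, dz = np.sin(half) * axis
        qw, qx, qy, qz = q
        return np.array([  # q_pred = q (x) dq
            qw * dw - qx * dx - qy * dy - qz * dz,
            qw * dx + qx * dw + qy * dz - qz * dy,
            qw * dy - qx * dz + qy * dw + qz * dx,
            qw * dz + qx * dy - qy * dx + qz * dw,
        ])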

LPMS-NAV Navigation Sensors for Mobile Robots and Automated Guided Vehicles (AGV)

We are proud to announce the release of a new series of high-precision sensors for autonomous vehicle navigation applications. The sensors are based on vibrating quartz gyroscopes with low-noise, low-drift characteristics and offer excellent capabilities for measuring slow to medium-speed rotations.

We offer the new sensors in various versions with different communication interfaces and housing options: LPMS-NAV2, LPMS-NAV2-RS232 and LPMS-NAV2-RS422. Please find more detailed information on our product page.

The following video shows a use case of one of our customers in China. The company uses the LPMS-NAV2-RS232 sensor for mobile robot navigation. Automated guided vehicles (AGVs) and cleaning robots are two of the principal application areas of the LPMS-NAV series.
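
To give a flavor of what such a sensor does on an AGV, here is a minimal dead-reckoning sketch that integrates the yaw rate into a heading estimate; read_yaw_rate() is a hypothetical placeholder for however the robot reads the gyro, not an actual LPMS-NAV2 API call.

    # Minimal heading dead-reckoning from a single-axis yaw-rate gyro, the kind of
    # task the LPMS-NAV series targets on an AGV. read_yaw_rate() is a hypothetical
    # placeholder, not an actual LPMS-NAV2 API call.
    import math

    def update_heading(heading_rad, yaw_rate_rad_s, dt):
        # Integrate the yaw rate over one control period and wrap to [-pi, pi).
        heading_rad += yaw_rate_rad_s * dt
        return (heading_rad + math.pi) % (2.0 * math.pi) - math.pi

    # Example control loop (dt = 10 ms):
    # heading = 0.0
    # while True:
    #     heading = update_heading(heading, read_yaw_rate(), 0.01)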

If you have any interest in this product, please contact us for further information.
