Immersive Driving Assistance with LPVIZ

How LPVIZ Augments Driving Reality

Going beyond a simple screen replacement, LPVIZ is an augmented reality driving assistance solution for the car. It displays content relevant to the driver or passenger in 3D, superimposed on reality. Content can be placed anywhere inside the car, such as a virtual speedometer over the dashboard, and anywhere outside the car, such as point-of-interest markers or navigation guidance.

Driving Assistance System From the Future

The video below shows what a drive around the block in Azabujuban, Tokyo looks like with LPVIZ. A virtual dashboard is projected onto the center console of the vehicle, arrows on the ground show lane guidance to the driver, and red Google Maps-style markers show points of interest. The virtual dashboard stays fixed to the same location in the car even when the vehicle turns, the navigation arrows move smoothly, and the point-of-interest markers remain globally anchored.

Perfectly Tuned Components

LPVIZ consists of several components that all have to interact perfectly to create a compelling and safe augmentation experience. The illustration below shows a block diagram of how the hardware components are connected.

Displaying useful content to the driver requires accurate tracking of two poses: the HMD pose in the local car coordinate system and the vehicle pose in a globally anchored frame. Precise calibration of all components of the solution is essential for the highest visual fidelity and for driver safety. Our LPVIZ product makes all parts of the system available in a compact form factor, ready to be integrated with any vehicle. A minimal sketch of how the two poses combine is shown below.
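To illustrate the frame chain, here is a minimal sketch, not LPVIZ code: composing the vehicle pose in the world with the HMD pose in the car yields the HMD pose in the world, which is what globally anchored content such as point-of-interest markers needs. All names and numbers are illustrative.

    import numpy as np

    def make_pose(R, t):
        """Build a 4x4 homogeneous transform from a rotation matrix and a translation."""
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = t
        return T

    # Vehicle pose in the global frame (e.g. from vehicle localization) and
    # HMD pose in the car frame (from in-cabin tracking); values are made up.
    world_from_car = make_pose(np.eye(3), np.array([100.0, 50.0, 0.0]))
    car_from_head = make_pose(np.eye(3), np.array([0.4, 0.0, 1.1]))

    # Cabin-fixed content (the virtual dashboard) only needs car_from_head;
    # globally anchored content needs the full chain.
    world_from_head = world_from_car @ car_from_head

    # Express a world-fixed point of interest in the HMD frame for rendering.
    poi_world = np.array([120.0, 55.0, 1.5, 1.0])
    poi_in_head = np.linalg.inv(world_from_head) @ poi_world
    print(poi_in_head[:3])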

The Past, Present and the Future

In the current development stage we’re focusing on the most essential aspects of the solution: displaying a virtual dashboard, navigation information and points of interest. While this content is proprietary, we’re opening our software to third-party developers so they can create their own content on our platform.

Currently we’re offering LPVIZ as a B2B solution for prototyping, design and research. However, we’re working on reducing system complexity to make it work as a consumer-facing automotive after-market solution, to be released later this year.

Towards a Consumer Product

We are very proud of the progress our team has made in the past months. We’re moving closer to making our vision of an augmented reality driving assistance system a reality for everyone. One very important take-away from our recent developments is that it’s indeed possible to provide real utility to the driver using technology that is readily available. It might still be early days, but we’re edging towards a product that could appeal to a wider consumer market. This is just the beginning.

LPVR-DUO in an Airborne Helicopter

In-Flight VR

Imagine soaring through the skies as a pilot, testing the limits of a helicopter’s capabilities while feeling the rush of wind and turbulence. Now imagine that instead of seeing the real world outside and the safe landing pad your helicopter is approaching, you see a virtual reality (VR) scene in which you are homing in on a ship in high seas. The National Research Council Canada (NRC) and Defence Research and Development Canada (DRDC) have brought this experience to life with their groundbreaking Integrated Reality In-Flight Simulation (IRIS).

IRIS is not your ordinary simulator; for one, it’s not sitting on a hexapod, it’s airborne. It’s a variable-stability helicopter based on the Bell 412 that can behave like other aircraft and can simulate varying weather conditions; combine that with a VR environment and you have a tool that allows safe training for operations in the most adverse conditions. In particular, it is used for Ship Helicopter Operating Limitations (SHOL) testing.

Mission-Critical Application with LPVR-DUO

The LPVR-DUO system is what makes VR possible on this constantly moving platform. This cutting-edge AR/VR tracking system seamlessly merges the inertial measurements taken by the headset with the helicopter’s motion data and with data from a camera system mounted inside the cabin to provide the correct visuals to the pilot. The challenge of using cameras to track the VR headset inside the tight environment of the helicopter under ever-changing lighting conditions is overcome with an ART SmartTrack 3 system, which follows an arrangement of reflective markers attached to the pilot’s helmet. The VR headset itself is attached to the helmet in such a way that the pilot can wear it as if it were a pair of night vision goggles. Put together, this allows the system to display a virtual world to the pilot, even during the most extreme maneuvers.
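To see why the helicopter’s own motion data matters, consider angular rates: on a moving aircraft the headset IMU senses cabin motion plus head motion. The sketch below is our illustration of that idea, not the LPVR-DUO algorithm; the extrinsic calibration rotation and all sample values are assumptions.

    import numpy as np

    # Assumed extrinsic calibration rotating platform-IMU readings into the
    # head-IMU frame (identity here purely for illustration).
    R_head_from_platform = np.eye(3)

    def head_rate_in_cabin(gyro_head, gyro_platform):
        """Angular rate of the head relative to the cabin, in head coordinates."""
        return gyro_head - R_head_from_platform @ gyro_platform

    gyro_head = np.array([0.10, 0.02, -0.05])      # rad/s, headset IMU
    gyro_platform = np.array([0.08, 0.00, -0.05])  # rad/s, helicopter IMU
    print(head_rate_in_cabin(gyro_head, gyro_platform))  # head-only motion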

To ensure an authentic experience, the IRIS system incorporates real-time turbulence models, meticulously crafted from wind tunnel trials. These turbulence effects are seamlessly integrated into the aircraft’s motion and into the VR scene, providing pilots with precise proprioceptive and vestibular cues. It’s a symphony of technology and innovation in the world of aviation testing.

In-Cockpit Implementation

The optical tracking system relies on highly reflective marker targets on the helmet to track movement in three dimensions. Initially, only five markers were installed, strategically placed for optimal tracking. But the pursuit of perfection led NRC to create custom 3D-printed low-reflectivity helmet molds, allowing them to mount a dozen small passive markers. This significantly improved tracking reliability in various lighting conditions and allowed for a wider range of head movement.

Recently, NRC put this remarkable concept to the test with actual flight trials. The response from pilots was nothing short of exhilarating. They found the system required minimal adaptation, exhibited no noticeable lag, and, perhaps most impressively, didn’t induce any motion sickness. Even the turbulence effects felt incredibly realistic. Surprisingly, the typical VR drawbacks, such as resolution and field of view limitations, had minimal impact, especially during close-in shipboard operations. It’s safe to say that IRIS has set a new standard for effective and immersive aviation testing.

Publication of Results

The NRC team presented their results in two papers [1] and [2] at the Vertical Flight Society’s 79th Annual Forum, and they also have a blog post on their site.

NOTE: Image contents courtesy of Aerospace Research Centre, National Research Council of Canada (NRC) – Ottawa, ON, Canada

Breaking News – LPVR Support for Varjo XR-4

Varjo Releases Stunning New AR Headset XR-4

– The new Varjo XR-4 headset. Image credit: Varjo

A leap forward in XR comes from our partners at Varjo, who have been pushing what is possible in VR and AR for the past few years.

Today, they announced their new flagship series of headsets, the XR-4! And, of course, we have worked with them to ensure that our LPVR software series is ready for it from the start.

The XR-4 boasts unmatched visual fidelity, not only for the VR content (with a field of view expanded to 120×105 degrees) but also for the mixed-reality pass-through, which reaches an unprecedented pixel density of 51 ppd in the central area. High-tech light sensors give an unparalleled quality of immersion by adjusting to external lighting conditions.

LPVR Support is Ready

All this is great, but what is a headset without a world to immerse yourself in? What is a VR racetrack without feeling the real motions of the car? Varjo clearly understood this and prepared: they collaborated with us to get LPVR ready for the launch of the XR-4.

– Varjo XR-4 headset with LP-Research custom marker holder

We have everything prepared for you!

  • Use your existing camera systems and props to augment the virtual world.
  • Use the industrial-grade precision of ART, OptiTrack, and Vicon tracking systems with the Varjo XR-4 and our custom marker holders.
  • Integrate the HMD with your race-car simulator or fighter jet platform.
  • Do all of this with the XR-4: LPVR-CAD and LPVR-DUO will make sure that you are tracked perfectly.


We offer full solutions of state-of-the-art tracking systems and content using the Varjo XR-4. Contact us for more information.

New Features in LPVR Version 4.8

Introduction

Our LPVR series is the leading solution on the market for users who want to expand the scope of their virtual reality or mixed reality headsets by using external tracking systems such as ART, OptiTrack or Vicon. Use cases are varied, ranging from entertainment (location-based VR) and engineering (ergonomic studies in AR) to helicopters and virtual cars actually driving on Japan’s public roads. At LP-Research, we have continuously developed the LPVR series over the past years, expanding its scope, adding support for new headsets, and including new functions.

The image below shows an LPVR installation based on design content created by automotive prototyping company Phiaro Inc. in Tokyo, Japan.

The latest release is version 4.8.0, which we released in June 2023. As usual, it comes in two flavors:

  • LPVR-CAD, which supports stationary use cases, and
  • LPVR-DUO, which is our variant for moving platforms, be they cars or simulators.

We support all the major tethered headsets (SteamVR headsets, Pimax, Varjo). We also support Meta Quest series headsets and the Vive Focus 3 with our LPVR-Air series of products. If you have a current support contract, you are eligible for an update.

A Brief Overview of LPVR-CAD and LPVR-DUO

It’s perhaps best to summarize some of the capabilities that our products add to the various commercial headsets. For more details, feel free to visit the product pages for LPVR-CAD and LPVR-DUO, respectively:

  • Cover arbitrarily large areas and have VR scenes taking place in them
  • Have an arbitrary number of users interact in such a space
  • Do VR/AR inside a car or any other moving platform
  • Track your user to sub-millimeter precision together with any number of props with no perceivable latency
  • Use SteamVR controllers without the Lighthouses

We can do this because our proprietary sensor fusion algorithms combine the measurements of high-precision motion tracking camera systems with the measurements of the headset’s Inertial Measurement Unit (IMU). For the case of a moving platform, we can additionally incorporate data from an IMU installed on the platform to provide responsive, accurate performance in those circumstances as well. A simplified sketch of this kind of fusion follows below.
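As a rough illustration of the idea, and emphatically not our proprietary algorithm, here is a minimal complementary filter: the headset IMU is integrated at high rate, and each low-rate optical pose pulls the orientation estimate back toward the camera measurement. The gain alpha, the sample rates and all values are made up.

    import numpy as np
    from scipy.spatial.transform import Rotation as R

    def predict(orientation, gyro, dt):
        """Integrate the body-frame angular rate over one IMU time step."""
        return orientation * R.from_rotvec(gyro * dt)

    def correct(orientation, optical, alpha=0.02):
        """Nudge the IMU-integrated estimate toward the optical measurement."""
        error = orientation.inv() * optical
        return orientation * R.from_rotvec(alpha * error.as_rotvec())

    est = R.identity()
    for _ in range(100):                      # 100 IMU samples at 1 kHz
        est = predict(est, np.array([0.0, 0.0, 0.5]), dt=0.001)
    optical_pose = R.from_euler("z", 0.05)    # one optical update at camera rate
    est = correct(est, optical_pose)
    print(est.as_euler("xyz"))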

New Features

For a short overview of the changes in each version, please refer to our Release Notes. Here we will give some highlights and dig into some details. LPVR 4.8.0 is the result of continuous development in the half year or so since our previous release.

New GUI Organization and Visual LPVR-DUO Configuration Interface

The most obvious change for users is the reorganized GUI, which streamlines the setup, completely does away with the need to enter any JSON code, and presents itself on a more cleanly organized surface. For our LPVR-DUO users especially, this means a vast simplification of the system. We have kept the old configuration interface as an option to guarantee compatibility with existing workflows, but we don’t expect users will have to resort to it. Please let us know if your experience is different. If your headset tracking body is already calibrated, you should now be able to set up LPVR-DUO with some five mouse clicks.

When you load up the configuration, it will look something like this. Note that you are no longer taken to a JSON editor where you have to enter the configuration manually. Instead, you are greeted by a friendly, informative GUI.

At the bottom of the page, you will see links to the Documentation, a Calibration screen, and an Expert Mode, which is basically the old JSON editor. The Calibration screen is used to set up the platform IMU and simplifies the process down to a few mouse clicks in the usual case. No more hunting through log files for quaternion values! Please check out the corresponding documentation.

Varjo Headset Eye Point Adjustments

Together with Varjo, and with the cooperation of several of our customers, we were able to identify and correct some imprecisions in the handling of the headset’s position. These showed up as small coordinate mismatches between the optical tracking coordinates and the coordinates reported to VRED, Unity, etc. They also led to unnatural motion of AR overlays, especially when turning the head.

Optimal performance requires updating both Varjo Base to at least version 3.10 and LPVR to at least version 4.8.0: updating Varjo Base fixes the underlying issue, while updating LPVR corrects the interfacing. If you cannot update Varjo Base, you can still update LPVR-CAD-Varjo to version 4.8.0 and enable a workaround. To do so, open the Varjo Base configuration GUI on the System tab, add patchPositionBug=true in the field labeled Additional Settings, and click the “Submit” button. Note that while this works around the issue in Varjo Base versions before 3.10, using this option with updated versions of Varjo Base is not recommended.

Varjo Configuration Refinements

Different environments call for different setups. Some of our users use administrator accounts; others have multiple users but want them all to share the same configuration. We have updated the way we organize on-disk storage of the configuration to address these possibilities. In particular, you can now establish a system-wide configuration default and override it per user. In the case of LPVR-CAD, the configuration is entered inside Varjo Base by default, but to allow users greater flexibility it has always been possible to use our web interface or files on disk to perform the configuration. While these are not the preferred choice, it was previously not possible to see from Varjo Base whether the on-disk configuration is in use. We have added a prominent status indicator that points to the configuration, as in the screenshot below. In the case of LPVR-DUO the configuration is always loaded from disk, as the added flexibility of our configuration page is required, but in LPVR-CAD the user has to opt in. We describe the process briefly below.

The user can set up a global, system-wide default configuration in %ProgramData%/Varjo/VarjoTracking/Plugins/LP-Research/LPVR-CAD-Varjo/configuration/settings.json. Changes on the configuration page will not change this configuration; they are instead written to the per-user configuration %LocalAppData%/LP-Research/LPVR-CAD-Varjo/settings.json. If either file is present, the configuration inside Varjo Base is ignored. For LPVR-DUO there is no configuration interface inside Varjo Base; instead, the user always points their web browser to http://localhost:7119. This configuration relies on the same files, but with the subdirectory LPVR-CAD replaced by LPVR-DUO: a system-wide default configuration can be placed in %ProgramData%/Varjo/VarjoTracking/Plugins/LP-Research/LPVR-DUO-Varjo/configuration/settings.json, and a per-user override can sit in %LocalAppData%/LP-Research/LPVR-DUO-Varjo/settings.json.
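The lookup order can be summarized in a few lines. The sketch below is our reading of the precedence described above, not LP-Research code; it assumes a Windows environment where %ProgramData% and %LocalAppData% expand as usual.

    import os

    def lpvr_settings_path(variant="LPVR-CAD-Varjo"):
        """Return the active settings.json, preferring the per-user override."""
        per_user = os.path.expandvars(
            rf"%LocalAppData%\LP-Research\{variant}\settings.json")
        system_wide = os.path.expandvars(
            rf"%ProgramData%\Varjo\VarjoTracking\Plugins\LP-Research"
            rf"\{variant}\configuration\settings.json")
        for path in (per_user, system_wide):
            if os.path.isfile(path):
                return path
        return None  # no file on disk: the Varjo Base configuration applies

    print(lpvr_settings_path("LPVR-DUO-Varjo"))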

LPVR-DUO Demonstration

In order to familiarize you with the neighborhood of our office and, more importantly, to show what can be done with LPVR-DUO, here is an in-car mixed reality demonstration. The video screens on the glove box may look almost real, but they are an overlay imposed on the see-through camera image of a Varjo XR-3 by an out-of-the-box LPVR-DUO setup. Notice how the screens stay firmly in place during turns of the user’s head as well as turns of the car itself, even when diving into some of the steeper roads of the Motoazabu area in central Tokyo.

How to Connect an LP-Research IMU to ROS (Update)

Introduction

This article describes how to connect an LP-Research inertial measurement unit (IMU) using a Robot Operating System (ROS) node. We are happy to announce that our IMU ROS sensor driver has been accepted into the official ROS package repository. The Robot Operating System, or ROS for short, is an open-source de facto standard for robotics sensing and control.

With the package openzen_sensor now provided as part of the ROS distribution Melodic Morenia, it just became a whole lot easier to use our sensors in robotic applications.

Note: This article covers our node for ROS 1. Please see further information regarding our ROS 2 node at the end of this article. This post is a follow-up to our previous ROS driver release.

Published ROS Topics

These are the ROS topics which are published by the OpenZen ROS driver:

  • /imu/data (Type: sensor_msgs/Imu): Inertial data from the IMU, including calibrated acceleration, calibrated angular rates and orientation. The orientation is always a unit quaternion.
  • /imu/mag (Type: sensor_msgs/MagneticField): Magnetometer reading from the sensor.
  • /imu/nav (Type: sensor_msgs/NavSatFix): Global position from a satellite navigation system. Only available if the IMU includes a GNSS chip.
  • /imu/is_autocalibration_active (Type: std_msgs/Bool): Latched topic indicating whether the gyro autocalibration feature is active.
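For a quick test of the /imu/data stream, a minimal rospy subscriber along these lines should work; it is a sketch assuming the standard sensor_msgs/Imu message, not part of the driver itself.

    #!/usr/bin/env python
    import rospy
    from sensor_msgs.msg import Imu

    def on_imu(msg):
        # The orientation arrives as a unit quaternion, as noted above.
        q = msg.orientation
        rospy.loginfo("orientation: [%.3f %.3f %.3f %.3f]", q.x, q.y, q.z, q.w)

    rospy.init_node("imu_listener")
    rospy.Subscriber("/imu/data", Imu, on_imu)
    rospy.spin()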

Installation of the LPMS ROS Driver

All that’s needed is to install the package openzen_sensor via your Linux distribution’s package manager. In Ubuntu, with the ROS Melodic Morenia distribution installed, use the following command:
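    sudo apt install ros-melodic-openzen-sensor
    # Package name assumed from the standard ros-melodic-<package> naming convention.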

Once the IMU ROS driver package is installed, we use the following command to start the OpenZen node:
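    rosrun openzen_sensor openzen_sensor_node
    # Node name as used in the OpenZen ROS driver; check the driver README if it differs.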

This will automatically connect to the first available IMU and start streaming its accelerometer, gyroscope and magnetometer data to ROS. If your sensor is equipped with a GPS unit, global positioning information will also be transferred to ROS.

Once a sensor has been connected via the motion sensor driver, the data from the sensor is exported via ROS topics which can be consumed by other ROS components such as a navigation and path planning system.

Outputting IMU sensor values on the command line can now be easily done with:
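    rostopic echo /imu/data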

and the data can be plotted with:
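    rosrun rqt_plot rqt_plot /imu/data/angular_velocity
    # The field path is one example; any numeric field of the message can be plotted.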

More information on the usage of the OpenZen IMU ROS driver can be found in the repository of the driver.

The image above shows an angular velocity graph from an LPMS-IG1 sensor in the ROS MatPlot application.

ROS 2 Release

We have recently released a ROS 2 version of our OpenZen ROS node. The node is not part of an official ROS 2 release yet, but it works well on the latest release, Foxy. For further information and source code, see the OpenZenROS2 repository.
