Revolutionizing AD/ADAS Testing: VR-Enhanced Vehicle-in-the-Loop

The automotive industry is in a race to develop smarter, safer, and more efficient vehicles. To meet these demands, engineers rely on sophisticated development processes. At LP-RESEARCH, we’re committed to creating tools that shape the future of mobility. That’s why we’re excited to announce a groundbreaking collaboration with IPG Automotive.

By integrating our advanced hardware and software with IPG Automotive’s CarMaker, we’ve created an immersive Virtual Reality (VR) experience for Vehicle-in-the-Loop (ViL) testing. This powerful solution allows engineers to test Autonomous Driving and Advanced Driver-Assistance Systems (AD/ADAS) on a proving ground with unprecedented realism and efficiency.

What is Vehicle-in-the-Loop (ViL)?

Vehicle-in-the-Loop is a powerful testing method that blends real-world driving with virtual simulation. An AD/ADAS-equipped vehicle drives on a physical test track while interacting with a dynamic virtual environment in real time.

This approach lets engineers observe the vehicle’s response to countless simulated scenarios under controlled, repeatable conditions. The vehicle’s real-world dynamics are continuously fed back into the simulation, ensuring the virtual world perfectly mirrors the physical state of the car. The test vehicle is outfitted with a seamless integration of hardware and software to support this constant flow of data.

The Technology Stack: A Powerful Combination

Our collaboration combines best-in-class hardware and software from both LP-RESEARCH and IPG Automotive to deliver a complete ViL solution.

LP-RESEARCH Stack

LPPOS: Our hardware system for acquiring physical data from the vehicle via ELM327 or OBDLink, complemented by Global Navigation Satellite System (GNSS) antennas and advanced Inertial Measurement Units (IMUs). LPPOS includes FusionHub, our sensor-fusion software for high-precision, real-time vehicle state estimation. More information here

LPVR-DUO: Specialized sensor-fusion software that calculates the Head-Mounted Display (HMD) pose relative to the moving vehicle. More information here

ART SMARTTRACK3: An advanced Infrared (IR) camera tracking system from our partner, Advanced Realtime Tracking (ART), for precise head tracking.

IPG Automotive Stack

CarMaker Office: The simulation environment, enabled with the ViL add-on.

Movie NX: The visualization tool, enhanced with the VR add-on to create the immersive experience.

How It All Works Together

The key to this integration is a custom plugin that connects our FusionHub software with CarMaker. This plugin translates the real vehicle’s precise position and orientation (its “pose”) into the virtual environment.

The system workflow is a seamless loop of data capture, processing, and visualization:

Data Acquisition: LPPOS gathers vehicle data (OBD), GNSS, and IMU measurements and sends them to FusionHub. The SMARTTRACK system monitors the HMD’s position, while IMUs on the headset and vehicle platform send orientation data to LPVR-DUO.

Sensor Fusion: FusionHub processes its inputs to calculate the vehicle’s exact pose in the real world. LPVR-DUO calculates the HMD’s pose relative to the moving vehicle’s interior.

Real-Time Communication: FusionHub streams the vehicle’s pose to a dedicated TCP server, which feeds the data directly into the CarMaker simulation via our custom plugin. LPVR-DUO communicates the headset’s pose to Movie NX using OpenVR, allowing the driver or engineer to naturally look around the virtual scene from inside the real car.
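To give a sense of the kind of data exchange this involves, here is a minimal Python sketch of a client that streams timestamped vehicle poses over TCP as JSON lines. The message fields, port, and update rate are illustrative assumptions; the actual wire format used between FusionHub and the CarMaker plugin is defined by our software and is not shown here.

```python
import json
import socket
import time

# Hypothetical pose message: position in meters, orientation as a unit
# quaternion (w, x, y, z). The real FusionHub/CarMaker wire format differs.
def make_pose_message(timestamp, position, quaternion):
    return (json.dumps({
        "timestamp": timestamp,
        "position": position,       # [x, y, z]
        "orientation": quaternion   # [w, x, y, z]
    }) + "\n").encode("utf-8")

def stream_poses(host="127.0.0.1", port=9000):
    # Connect to a pose consumer, e.g. a simulation-side plugin listening on TCP.
    with socket.create_connection((host, port)) as sock:
        while True:
            msg = make_pose_message(
                time.time(),
                [12.3, -4.5, 0.0],      # placeholder vehicle position
                [1.0, 0.0, 0.0, 0.0],   # placeholder vehicle orientation
            )
            sock.sendall(msg)
            time.sleep(0.01)            # roughly 100 Hz update rate

if __name__ == "__main__":
    stream_poses()
```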

The entire LP-RESEARCH software stack and CarMaker Office run concurrently on a single computer inside the test vehicle, creating a compact and powerful setup.

See It in Action

We configured a scene in the CarMaker Scenario Editor that meticulously replicates our test track near the LP-RESEARCH Tokyo Office. The video on the top of this post demonstrates the fully integrated system, showcasing how the vehicle’s real-world position perfectly matches its virtual counterpart. Notice how the VR perspective shifts smoothly as the copilot moves their head inside the vehicle.

This setup vividly illustrates how VR technology makes ViL testing more immersive, effective, and even fun.

Advance Your ViL Testing Today

Are you ready to integrate cutting-edge virtual reality into your Vehicle-in-the-Loop testing and help shape the future of mobility?

Fine-tuning the HMD view and virtual vehicle reference frame is crucial for an accurate simulation and depends on the specific test vehicle and scenario. Our team has the expertise to configure these parameters for you or provide expert guidance to ensure a perfect setup.

Contact us today to learn how our tailored solutions and expert support can elevate your AD/ADAS development process.

Introducing LP-Research’s SLAM System with Full Fusion for Next-Gen AR/VR Tracking

At LP-Research, we have been pushing the boundaries of spatial tracking with our latest developments in Visual SLAM (Simultaneous Localization and Mapping) and sensor fusion technologies. Our new SLAM system, combined with what we call “Full Fusion,” is designed to deliver highly stable and accurate 6DoF tracking for robotics, augmented and virtual reality applications.

System Setup

To demonstrate the progress of our development, we ran LPSLAM together with FusionHub on a host computer and forwarded the resulting pose to a Meta Quest 3 mixed reality headset for visualization using LPVR-AIR. We created a custom 3D-printed mount to affix the sensors needed for SLAM and Full Fusion, a ZED Mini stereo camera and an LPMS-CURS3 IMU sensor, onto the headset.

This mount ensures proper alignment of the sensor and camera with respect to the headset’s optical axis, which is critical for accurate fusion results. The system connects via USB and runs on a host PC that communicates wirelessly with the HMD. An image of how IMU and camera are attached to the HMD is shown below.
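To illustrate why this alignment matters, the following sketch (a simplified assumption, not our FusionHub implementation) applies a fixed camera-to-HMD extrinsic transform to a pose reported by the SLAM camera. The numerical offsets are placeholders; in practice the extrinsics are derived from the mount geometry and calibration.

```python
import numpy as np

def pose_to_matrix(position, rotation=np.eye(3)):
    """Build a 4x4 homogeneous transform from a position and rotation matrix."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = position
    return T

# Camera pose in the SLAM map frame, as reported by visual SLAM (placeholder values).
T_map_cam = pose_to_matrix([1.2, 0.0, 0.4])

# Fixed extrinsic from the 3D-printed mount: HMD frame expressed in the camera
# frame (hypothetical offsets in meters, identity rotation).
T_cam_hmd = pose_to_matrix([-0.05, -0.03, 0.0])

# The HMD pose in the map frame is the composition of the two transforms.
T_map_hmd = T_map_cam @ T_cam_hmd
print(T_map_hmd[:3, 3])  # HMD position in the map frame
```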

At the current stage of development we ran tests in our laboratory. The images below show a photo of the environment next to how this environment translates into an LPSLAM map.

Tracking Across Larger Areas

A walk through the office based on a pre-built map yields good results. The fusion in this experiment is our regular IMU-optical fusion and therefore does not estimate translation by integrating accelerometer data. This leads to short interruptions of position tracking in certain areas where feature points aren’t found. We at least partially solve this problem with the Full Fusion shown in the next paragraph.

What is Full Fusion?

Traditional tracking systems rely either on Visual SLAM or IMU (Inertial Measurement Unit) data, often with one compensating for the other. Our Full Fusion approach goes beyond orientation fusion and integrates both IMU and SLAM data to estimate not just orientation but also position. This combination provides smoother, more stable tracking even in complex, dynamic environments where traditional methods tend to struggle.

By fusing IMU velocity estimates with visual SLAM pose data through a specialized filter algorithm, our system handles rapid movements gracefully and removes the jitter seen in SLAM-only tracking. The IMU handles fast short-term movements while SLAM ensures long-term positional stability. Our latest releases even support alignment using fiducial markers, allowing the virtual scene to anchor precisely to the real world. The video below shows the SLAM in conjunction with the Full Fusion.
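As a rough illustration of the principle (not our actual filter), the sketch below blends a high-rate position propagated from IMU velocity with lower-rate SLAM position updates using a simple complementary correction. The gains and rates are arbitrary placeholders; the production system uses a considerably more sophisticated filter.

```python
import numpy as np

class SimplePositionFusion:
    """Toy complementary filter: IMU velocity propagates position at a high rate,
    SLAM position updates pull the estimate back to remove drift."""

    def __init__(self, blend=0.05):
        self.position = np.zeros(3)
        self.blend = blend  # weight of each SLAM correction (arbitrary value)

    def predict(self, imu_velocity, dt):
        # High-rate step: integrate the velocity estimated from IMU data.
        self.position += np.asarray(imu_velocity) * dt

    def correct(self, slam_position):
        # Low-rate step: move the estimate toward the SLAM position.
        self.position += self.blend * (np.asarray(slam_position) - self.position)

# Usage sketch: 100 Hz IMU prediction with a SLAM correction every 10th sample.
fusion = SimplePositionFusion()
for i in range(100):
    fusion.predict(imu_velocity=[0.5, 0.0, 0.0], dt=0.01)
    if i % 10 == 0:
        fusion.correct(slam_position=[0.5 * i * 0.01, 0.0, 0.0])
print(fusion.position)
```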

Real-World Testing and Iteration

We’ve extensively tested this system in both lab conditions and challenging real-world environments. Our recent experiments demonstrated excellent results. By integrating our LPMS IMU sensor and running our software pipeline (LPSLAM and FusionHub), we achieved room-scale tracking with sub-centimeter accuracy and rotation errors as low as 0.45 degrees.

In order to evaluate the performance of the overall solution we compared the output from FusionHub with pose data recorded by an ART Smarttrack 3 tracking system. The accuracy of an ART tracking system is in the sub-millimeter range and is therefore sufficiently accurate to characterize the performance of our SLAM. The result of one of several measurement runs is shown in the image below. Note that both systems were spatially aligned and timestamp-synchronized to correctly compare poses.
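As a minimal sketch of this kind of evaluation, the snippet below interpolates a reference trajectory to the timestamps of the system under test and computes a position RMSE plus a quaternion angle error. The data and function names are hypothetical and do not reflect the actual tooling we used.

```python
import numpy as np

def position_rmse(t_ref, p_ref, t_test, p_test):
    """Interpolate the reference positions to the test timestamps and return the RMSE."""
    p_ref_interp = np.stack(
        [np.interp(t_test, t_ref, p_ref[:, k]) for k in range(3)], axis=1)
    err = np.linalg.norm(p_test - p_ref_interp, axis=1)
    return np.sqrt(np.mean(err ** 2))

def rotation_error_deg(q_a, q_b):
    """Angle in degrees between two unit quaternions (w, x, y, z)."""
    dot = abs(float(np.dot(q_a, q_b)))
    return np.degrees(2.0 * np.arccos(min(1.0, dot)))

# Synthetic example: the test trajectory follows the reference exactly.
t_ref = np.linspace(0.0, 10.0, 1000)
p_ref = np.stack([0.1 * t_ref, np.zeros_like(t_ref), np.zeros_like(t_ref)], axis=1)
t_test = np.linspace(0.0, 10.0, 300)
p_test = np.stack([0.1 * t_test, np.zeros_like(t_test), np.zeros_like(t_test)], axis=1)

print(position_rmse(t_ref, p_ref, t_test, p_test))                      # -> 0.0
print(rotation_error_deg([1.0, 0.0, 0.0, 0.0], [1.0, 0.0, 0.0, 0.0]))   # -> 0.0
```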

Developer-Friendly and Cross-Platform

The LP-Research SLAM and FusionHub stack is designed for flexibility. Components can run on the PC and stream results to an HMD wirelessly, enabling rapid development and iteration. The system supports OpenXR-compatible headsets and has been tested with Meta Quest 3, Varjo XR-3, and more. Developers can also log and replay sessions for detailed tuning and offline debugging.
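A minimal illustration of what session logging and replay can look like is sketched below; the file format and field names are assumptions for illustration and are unrelated to the actual FusionHub log format.

```python
import json
import time

def log_pose(f, timestamp, position, orientation):
    # Append one pose sample as a JSON line (hypothetical file format).
    f.write(json.dumps({"t": timestamp, "p": position, "q": orientation}) + "\n")

def replay(path, callback):
    """Re-deliver logged poses with their original relative timing."""
    with open(path) as f:
        samples = [json.loads(line) for line in f]
    start_wall, start_log = time.time(), samples[0]["t"]
    for sample in samples:
        # Wait until the sample's original offset from the start of the log.
        delay = (sample["t"] - start_log) - (time.time() - start_wall)
        if delay > 0:
            time.sleep(delay)
        callback(sample)
```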

Looking Ahead

Our roadmap includes support for optical flow integration to improve SLAM stability further, expanded hardware compatibility, and refined UI tools for better calibration and monitoring. We’re also continuing our efforts to improve automated calibration and simplify the configuration process.

This is just the beginning. If you’re building advanced AR/VR systems and need precise, low-latency tracking that works in the real world, LP-Research’s Full Fusion system is ready to support your journey.

To learn more or get involved in our beta program, reach out to us.

Wireless Mixed Reality with LPVR-AIR 3.3 and Meta Quest

Achieving Accurate Mixed Reality Overlays

In a previous blog post we showed the difficulties of precisely aligning virtual and real content using the Varjo XR-3 mixed reality headset. Despite the Varjo XR-3 being a high-quality headset and tracking accurately with LPVR-CAD, we had difficulties reaching correct alignment for different angles and distances from an object. We concluded that the relatively wide distance between the video passthrough cameras and the displays of the HMD causes distortions that are hard for the Varjo HMD’s software to correct.

Consumer virtual reality headsets like the Meta Quest 3 have only recently become equipped with video passthrough cameras and displays that operate at an image quality similar to the Varjo headsets. We have therefore started to extend our LPVR-AIR wireless VR software with mixed reality capabilities. This allows us to create similar augmented reality scenarios with the Quest 3 as with the Varjo XR series HMDs.

Full MR Solution with LPVR-AIR and Meta Quest

The Quest 3 uses pancake optics that allow for a much closer distance between the passthrough cameras and the displays. Therefore the correction the HMD has to apply to the camera images to accurately align virtual and real content is reduced. We show this in the video above. We’re tracking the HMD using our LPVR-AIR sensor fusion and an ART Smarttrack 3 outside-in tracking system. Even though the tracking accuracy we can reach with the tracking camera placed relatively far away from the HMD is limited, we achieve a very good alignment between the virtual cube and the real cardboard box, even with varying distances from the object.

This shows that a state-of-the-art mixed reality setup can be achieved using a consumer-grade HMD like the Meta Quest 3 together with a cost-efficient outside-in tracking solution. The fact that the Quest 3 is wirelessly connected to the rendering server adds to the ease of use of this solution.

The overlay accuracy of this solution is superior to all other solutions on the market that we’ve tried. Marker-based outside-in tracking guarantees long-term accuracy and repeatability, which is usually an issue with inside-out or Lighthouse-based tracking. This functionality is supported from LPVR-AIR version 3.3 onwards.

Controller and Optical Marker Tracking

In addition to delivering high-quality mixed reality and precise wireless headset tracking, LPVR-AIR seamlessly integrates controllers tracked by the HMD’s inside-out system with objects tracked via optical targets in the outside-in tracking frame, all within a unified global frame. The video above shows this unique capability in action.
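Conceptually, bringing an inside-out-tracked controller into the outside-in global frame is a chain of coordinate transforms: the controller pose reported relative to the HMD is composed with the HMD pose measured in the outside-in frame. The sketch below shows this composition with placeholder values; it is not the LPVR-AIR implementation.

```python
import numpy as np

def pose(position, rotation=np.eye(3)):
    """4x4 homogeneous transform from a position and rotation matrix."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = position
    return T

# HMD pose in the global (outside-in) frame, e.g. from the optical target on the headset.
T_global_hmd = pose([2.0, 1.0, 1.5])

# Controller pose relative to the HMD, as reported by inside-out tracking.
T_hmd_ctrl = pose([0.3, -0.2, -0.4])

# Controller pose in the unified global frame.
T_global_ctrl = T_global_hmd @ T_hmd_ctrl
print(T_global_ctrl[:3, 3])  # -> [2.3, 0.8, 1.1]
```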

When combined with our LPVR-CAD software, LPVR-AIR enables the tracking of any number of rigid bodies within the outside-in tracking volume. This provides an intuitive solution for tracking objects such as vehicle doors, steering wheels, or other cockpit components. Outside-in optical markers are lightweight, cost-effective, and require no power supply. With camera-based outside-in tracking, all objects within the tracking volume remain continuously tracked, regardless of whether the user is looking at them. They can be positioned with millimeter accuracy and function reliably under any lighting conditions, from bright daylight to dark studio environments.

In-Car Head Tracking with LPVR-AIR

After confirming the capability of LPVR-AIR to work well with large room-scale mixed reality setups, we started extending the system’s functionality to perform accurate head tracking in a moving vehicle or on a simulator motion platform. For this purpose we ported the algorithm used by our LPVR-DUO solution to LPVR-AIR. With some adjustments we were able to reach a performance very similar to LPVR-DUO, this time with a wireless setup.

Whereas the video pass-through quality of the Quest and Varjo HMDs is comparable in day and night-time scenarios, the lightness and comfort of a wireless solution is a big advantage. Compatibility with all OpenVR- or OpenXR-compatible applications on the rendering server makes this solution a unique state-of-the-art simulation and prototyping tool for automotive and aerospace product development.

Release notes

See the release notes for LPVR-AIR 3.3 here.

LPVR New Release 4.9.2 – Varjo XR-4 Controller Integration and Key Improvements

New release LPVR-CAD and LPVR-DUO 4.9.2

As it is with software, our LPVR-CAD and LPVR-DUO products for high-fidelity VR and AR need maintenance updates. Keeping up to date with the wide range of supported hardware, as well as fixing issues that are discovered, necessitates a release every now and then. This blog post summarizes the changes in the latest version, LPVR-CAD 4.9.2 and LPVR-DUO 4.9.2.

Support for Varjo XR-4 Controllers

The feature with the highest visibility is support for the hand controllers that Varjo ships with the Varjo XR-4 headset. These controllers are tracked by the headset itself, and Varjo Base 4.4 adds an opt-in way of supporting them with LPVR-CAD. Varjo does not enable the controllers by default because the increased USB traffic can negatively affect performance on some systems, and so an LPVR user has to decide whether the added support is worth it on their system. Of course, we also continue supporting the SteamVR controllers together with LPVR-CAD. We detailed their use with the XR-4 in our documentation.

To enable the Varjo controllers in LPVR-CAD, first open Varjo Base. Then navigate to the System tab in Varjo Base. When LPVR-CAD is configured you will find a new input field, depicted below.

Setting its value to “true” will enable controller support, and “false” will disable it. After changing the value, scroll down to the Submit button and click it to effect the change. Varjo also recommends restarting Base after making this change.

Please note that this input is handled by Varjo Base itself, and so this button will also appear in older versions of Varjo Base, for reasons that are too broad to go into here. Providing this support quickly had a higher priority for Varjo and us than polish. One issue that can cause confusion is that the Varjo Home screen will not display the controllers, at least in Varjo Base 4.4.0. Unity applications will have to be updated to a recent version of the Varjo plugin. Varjo is working on improving these issues.

 

Updated Support for JVC HMD-VS1W

An interesting see-through AR headset is JVC’s HMD-VS1W. It is a niche product which is typically used in the aeronautical sector. This headset uses Valve tracking with a few custom twists. A recent software update on their side (version 1.5.0) broke compatibility with LPVR, but it was easy enough to restore full compatibility.

 

Various other changes

One of the key points when creating an immersive VR and AR experience is that the motion should appear as smooth as possible. We are therefore constantly refining our algorithms to meet that goal. This release significantly improves the smoothness of rotations, especially for Varjo’s third-generation headsets such as the Varjo Aero and the Varjo XR-3.

We fixed an issue where, under some circumstances, LPVR-DUO would crash after calibrating the platform IMU. This was related to a multi-threading problem which caused a so-called deadlock in the driver.

We also added support for a global configuration of our SteamVR driver which can be overridden by local users. Since automatic support for this requires major changes to our installers and uninstallers, we decided to postpone enabling this feature by default. Please get in touch if that is something you want to use already.

We have often recommended the so-called “freeGravity” feature to our users, as it improves visual performance in most circumstances. We have now changed the default for this setting to match the most common use cases.

 

*Important note for LPVR-CAD, LPVR-DUO users with Varjo headsets:

Customers who initially purchased LPVR-CAD or LPVR-DUO for the Varjo XR-3 and wish to upgrade to the XR-4 must purchase a separate license upgrade to ensure compatibility. Orders placed before 2024 only cover up to the Varjo XR-3 HMD, while orders placed from 2024 onwards cover up to the Varjo XR-4 HMD.

We recommend reviewing your maintenance coverage and hardware plans before making upgrades or deploying LPVR across multiple locations. For questions, feel free to contact our support team.

Accurate Mixed Reality with LPVR-CAD and Varjo XR-3

Anchoring Virtual Objects with Varjo XR-3

A key aspect of making a mixed reality experience compelling and useful is to correctly anchor virtual content to the real world. An object that’s fixed in its position and orientation (pose) relative to the real world should not change its pose as seen by the user when they’re moving around the scene wearing the headset. Imagine a simple virtual cube positioned to be sitting on a real table. As the user walks around, this cube should remain in place, regardless of the perspective from which the user looks at it.
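In rendering terms, keeping such a cube world-fixed means recomputing its pose in the headset’s view frame every frame from the tracked head pose. The sketch below shows the underlying transform with placeholder numbers; it is an engine-agnostic illustration, not code from our products.

```python
import numpy as np

def pose(position, rotation=np.eye(3)):
    """4x4 homogeneous transform from a position and rotation matrix."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = position
    return T

# World-fixed cube on the table (constant) and the current tracked head pose.
T_world_cube = pose([0.0, 0.8, 1.2])
T_world_head = pose([0.5, 1.6, 0.0])

# Pose of the cube in the head/view frame, recomputed every frame. As long as
# the head pose is accurate, the rendered cube stays locked to the real table.
T_head_cube = np.linalg.inv(T_world_head) @ T_world_cube
print(T_head_cube[:3, 3])  # -> [-0.5, -0.8, 1.2]
```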

In more detail, correct anchoring of virtual objects to reality depends on the following points:

  1. In order to create correctly aligned content in a mixed reality experience, accurate knowledge of the user’s head pose is essential. We calculate the head pose using LPVR-CAD.
  2. The field of view provided by the cameras and the natural field of view of the human eyes are very different. Appropriate calibration is needed to compensate for this effect. The illustration below shows this inherent problem of video pass-through very clearly. Varjo’s HMDs are factory calibrated to minimize the impact of this effect on the user experience.

To get a better idea how a correct optical see-through (OST) calibration influences mixed reality performance we recommend playing with the camera configuration options in Varjo Lab Tools.

– Image credit: Varjo mixed reality documentation

Functional Testing of MR Performance

Our LPVR solution must at the very least achieve a precision that is satisfactory for our users’ typical applications. Therefore we decided to do a series of experiments to evaluate the precision of our system for mixed reality experiences and how it compares with SteamVR Lighthouse tracking.

We looked at the following configurations to make our evaluation:

 

#  | HMD        | Engine     | Tracking system
1  | Varjo XR-3 | Unreal 5.2 | LPVR-CAD
2  | Varjo XR-3 | Unreal 5.2 | Lighthouse

1 – LPVR Tracking System

In this scenario we fixed a simple virtual cube on a tabletop in a defined position and orientation. The cube is the only virtual object in this scenario; everything else is the passed-through live video feed from the HMD cameras. We are tracking the HMD using LPVR-CAD in connection with an ART Smarttrack 3. No markers are used to stabilize the pose of the cube.

Important note: The tracking performance for the mixed reality use case can significantly change if the marker target that is attached to the HMD is not correctly adjusted. Please refer to the LPVR-CAD documentation or contact us for further support.

2 – Lighthouse Tracking System

This scenario is in its basic setup identical to scenario #1, except that we’re using Lighthouse tracking to find the pose of the HMD.

Conclusion

See a table with our preliminary findings below:

LPVR Tracking | Lighthouse Tracking
2             | 2.5
1.5           | 1.5

– Approximate displacement error on horizontal plane (in cm)

Using the same room setup and test scene, the mixed reality accuracy of LPVR-CAD and Lighthouse tracking is similar. With both tracking systems, slight shifts of 1-2 cm can be observed, depending on the head movement. Good results with LPVR tracking require precise adjustment of the optical target attached to the headset.

Note that our method of estimating the displacement error is qualitative rather than quantitative. With this post, we made a general comparison of LPVR and Lighthouse tracking. A more quantitatively accurate evaluation will follow.

For customers wanting to reduce as much drift as possible, we recommend the use of optical tracking. There may be different results using Varjo XR-4 and other variations on the tracking environment or displayed content, which could warrant further testing in the future.

*We provide complete solutions with cutting-edge tracking systems and content for a variety of headsets. For detailed information on supported HMD models, please reach out to us.
