About Klaus Petersen

I like to create magical things, especially projects related to new technologies like augmented and virtual reality, mobile robotics and MEMS-based sensor networks. I code in C(++) and Python, trying to keep up with my very talented colleagues :-)

Introducing LP-Research’s SLAM System with Full Fusion for Next-Gen AR/VR Tracking

At LP-Research, we have been pushing the boundaries of spatial tracking with our latest developments in Visual SLAM (Simultaneous Localization and Mapping) and sensor fusion technologies. Our new SLAM system, combined with what we call “Full Fusion,” is designed to deliver highly stable and accurate 6DoF tracking for robotics, augmented and virtual reality applications.

System Setup

To demonstrate the progress of our development, we ran LPSLAM together with FusionHub on a host computer and forwarded the resulting pose to a Meta Quest 3 mixed reality headset for visualization using LPVR-AIR. We created a custom 3D-printed mount to affix the sensors needed for SLAM and Full Fusion, a ZED Mini stereo camera and an LPMS-CURS3 IMU sensor, onto the headset.

This mount ensures proper alignment of the sensor and camera with respect to the headset’s optical axis, which is critical for accurate fusion results. The system connects via USB and runs on a host PC that communicates wirelessly with the HMD. An image of how the IMU and camera are attached to the HMD is shown below.
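To give an idea of why the rigid mount matters: the fusion assumes a fixed extrinsic transform between the camera and IMU frames. Below is a minimal Python sketch of how a SLAM pose expressed in the camera frame could be mapped into the IMU/headset frame through such a transform. The function names and the example offset are hypothetical; the actual extrinsics come from the mount geometry and a calibration procedure.

```python
import numpy as np

# Minimal sketch: mapping a SLAM pose from the camera frame into the
# headset/IMU frame using a fixed extrinsic transform. The example
# extrinsic values below are hypothetical placeholders.

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical camera-to-IMU extrinsic: a small, fixed offset defined by
# the 3D-printed mount (here: 2 cm along the x axis, no rotation).
T_imu_cam = make_transform(np.eye(3), np.array([0.02, 0.0, 0.0]))

def slam_pose_to_imu_frame(T_world_cam):
    """Express a camera-frame SLAM pose in the IMU frame."""
    return T_world_cam @ np.linalg.inv(T_imu_cam)
```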

At the current stage of development we have run tests in our laboratory. The images below show a photo of the environment next to the LPSLAM map generated from it.

Tracking Across Larger Areas

A walk through the office based on a pre-built map yields good results. The fusion in this experiment is our regular IMU-optical fusion, which does not supplement position tracking by integrating accelerometer data. This leads to short interruptions of position tracking in areas where too few feature points are found. We at least partially solve this problem with the Full Fusion shown in the next section.

What is Full Fusion?

Traditional tracking systems rely either on Visual SLAM or IMU (Inertial Measurement Unit) data, often with one compensating for the other. Our Full Fusion approach goes beyond orientation fusion and integrates both IMU and SLAM data to estimate not just orientation but also position. This combination provides smoother, more stable tracking even in complex, dynamic environments where traditional methods tend to struggle.

By fusing IMU velocity estimates with visual SLAM pose data through a specialized filter algorithm, our system handles rapid movements gracefully and removes the jitter seen in SLAM-only tracking. The IMU handles fast short-term movements while SLAM ensures long-term positional stability. Our latest releases even support alignment using fiducial markers, allowing the virtual scene to anchor precisely to the real world. The video below shows the SLAM in conjunction with the Full Fusion.
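To illustrate the general idea of loosely coupled position fusion (our actual Full Fusion filter is proprietary and considerably more sophisticated), here is a minimal Python sketch. The class name, structure and gain value are illustrative assumptions, not our implementation:

```python
import numpy as np

# Minimal sketch of loosely coupled position fusion: high-rate IMU velocity
# estimates predict motion between frames, and lower-rate SLAM poses pull
# the estimate back toward a drift-free reference. Structure and gain are
# illustrative only.

class PositionFusion:
    def __init__(self, correction_gain=0.05):
        self.position = np.zeros(3)   # fused position estimate (meters)
        self.gain = correction_gain   # how strongly SLAM corrects the IMU path

    def predict(self, velocity, dt):
        """IMU step: integrate velocity over one high-rate sample period."""
        self.position += velocity * dt

    def correct(self, slam_position):
        """SLAM step: blend in the absolute pose to cancel IMU drift."""
        self.position += self.gain * (slam_position - self.position)

# Usage: predict() at a high IMU rate (e.g. 400 Hz), correct() whenever a
# SLAM pose arrives (e.g. 30 Hz). The gain trades responsiveness for jitter.
fusion = PositionFusion()
fusion.predict(velocity=np.array([0.1, 0.0, 0.0]), dt=1 / 400)
fusion.correct(slam_position=np.array([0.001, 0.0, 0.0]))
```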

Real-World Testing and Iteration

We’ve extensively tested this system in both lab conditions and challenging real-world environments. Our recent experiments demonstrated excellent results. By integrating our LPMS IMU sensor and running our software pipeline (LPSLAM and FusionHub), we achieved room-scale tracking with sub-centimeter accuracy and rotation errors as low as 0.45 degrees.

In order to evaluate the performance of the overall solution we compared the output from FusionHub with pose data recorded by an ART Smarttrack 3 tracking system. The accuracy of an ART tracking system is in the sub-millimeter range and is therefore sufficient to characterize the performance of our SLAM. The result of one of several measurement runs is shown in the image below. Note that both systems were spatially aligned and timestamp-synchronized to allow a correct comparison of poses.
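For readers curious how such a comparison can be set up, the Python sketch below shows the usual steps: resampling the reference trajectory onto the estimate’s timestamps, rigidly aligning the two point sets, and computing the residual position error. The function names are hypothetical and this is not our actual evaluation code:

```python
import numpy as np

# Sketch of comparing two pose recordings: interpolate the reference
# (e.g. Smarttrack) onto the estimate's (e.g. FusionHub) timestamps, find
# the rigid transform that best aligns the two point sets (Kabsch), then
# compute the root-mean-square residual position error.

def interpolate(ref_t, ref_xyz, est_t):
    """Resample reference positions at the estimate's timestamps."""
    return np.stack(
        [np.interp(est_t, ref_t, ref_xyz[:, i]) for i in range(3)], axis=1
    )

def align(A, B):
    """Rigid transform (R, t) minimizing ||R @ a + t - b|| (Kabsch)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    U, _, Vt = np.linalg.svd((A - ca).T @ (B - cb))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cb - R @ ca

def rmse(est_t, est_xyz, ref_t, ref_xyz):
    ref = interpolate(ref_t, ref_xyz, est_t)
    R, t = align(est_xyz, ref)
    residual = (est_xyz @ R.T + t) - ref
    return np.sqrt((residual ** 2).sum(axis=1).mean())
```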

Developer-Friendly and Cross-Platform

The LP-Research SLAM and FusionHub stack is designed for flexibility. Components can run on the PC and stream results to an HMD wirelessly, enabling rapid development and iteration. The system supports OpenXR-compatible headsets and has been tested with Meta Quest 3, Varjo XR-3, and more. Developers can also log and replay sessions for detailed tuning and offline debugging.

Looking Ahead

Our roadmap includes support for optical flow integration to improve SLAM stability further, expanded hardware compatibility, and refined UI tools for better calibration and monitoring. We’re also continuing our efforts to improve automated calibration and simplify the configuration process.

This is just the beginning. If you’re building advanced AR/VR systems and need precise, low-latency tracking that works in the real world, LP-Research’s Full Fusion system is ready to support your journey.

To learn more or get involved in our beta program, reach out to us.

Wireless Mixed Reality with LPVR-AIR 3.3 and Meta Quest

Achieving Accurate Mixed Reality Overlays

In a previous blog post we showed the difficulties of precisely aligning virtual and real content using the Varjo XR-3 mixed reality headset. Although the Varjo XR-3 is a high-quality headset and LPVR-CAD provides accurate tracking, we had difficulties reaching correct alignment at different angles and distances from an object. We concluded that the relatively wide distance between the video passthrough cameras and the displays of the HMD causes distortions that are hard to correct in the Varjo HMD’s software.

Consumer virtual reality headsets like the Meta Quest 3 have only recently been equipped with video passthrough cameras and displays that offer image quality similar to that of the Varjo headsets. We have therefore started to extend our LPVR-AIR wireless VR software with mixed reality capabilities. This allows us to create augmented reality scenarios with the Quest 3 similar to those possible with the Varjo XR series HMDs.

Full MR Solution with LPVR-AIR and Meta Quest

The Quest 3 uses pancake optics that allow for a much shorter distance between the passthrough cameras and the displays. This reduces the correction the HMD has to apply to the camera images to align virtual and real content accurately. We show this in the video above. We’re tracking the HMD using our LPVR-AIR sensor fusion and an ART Smarttrack 3 outside-in tracking system. Even though the tracking accuracy we can reach with the tracking camera placed relatively far away from the HMD is limited, we achieve a very good alignment between the virtual cube and the real cardboard box, even at varying distances from the object.

This shows that a state-of-the-art mixed reality setup can be achieved using a consumer-grade HMD like the Meta Quest 3 together with a cost-efficient outside-in tracking solution. The fact that the Quest 3 is wirelessly connected to the rendering server adds to the ease of use of this solution.

The overlay accuracy of this solution is superior to that of all other solutions on the market that we’ve tried. Marker-based outside-in tracking guarantees long-term accuracy and repeatability, which is usually an issue with inside-out or Lighthouse-based tracking. This functionality is supported from LPVR-AIR version 3.3 onwards.

Controller and Optical Marker Tracking

In addition to delivering high-quality mixed reality and precise wireless headset tracking, LPVR-AIR seamlessly integrates controllers tracked by the HMD’s inside-out system with objects tracked via optical targets in the outside-in tracking frame, all within a unified global frame. The video above shows this unique capability in action.

When combined with our LPVR-CAD software, LPVR-AIR enables the tracking of any number of rigid bodies within the outside-in tracking volume. This provides an intuitive solution for tracking objects such as vehicle doors, steering wheels, or other cockpit components. Outside-in optical markers are lightweight, cost-effective, and require no power supply. With camera-based outside-in tracking, all objects within the tracking volume remain continuously tracked, regardless of whether the user is looking at them. They can be positioned with millimeter accuracy and function reliably under any lighting conditions, from bright daylight to dark studio environments.

In-Car Head Tracking with LPVR-AIR

After confirming that LPVR-AIR works well with large room-scale mixed reality setups, we started developing the system’s functionality to do accurate head tracking in a moving vehicle or on a simulator motion platform. For this purpose we ported the algorithm used by our LPVR-DUO solution to LPVR-AIR. With some adjustments we were able to reach performance very similar to LPVR-DUO, this time with a wireless setup.

Whereas the video passthrough quality of the Quest and Varjo HMDs is comparable in daytime and nighttime scenarios, the lightness and comfort of a wireless solution is a big advantage. Compatibility with all OpenVR- or OpenXR-compatible applications on the rendering server makes this solution a unique state-of-the-art simulation and prototyping tool for automotive and aerospace product development.

Release notes

See the release notes for LPVR-AIR 3.3 here.

LPVR-AIR for Immersive Collaborative Industrial Design

Wireless Content Streaming

LPVR-AIR is LP-Research’s wireless VR streaming solution. Content is generated on a rendering computer and wirelessly streamed to a VR headset for display. At the same time the pose, i.e. orientation and position, of the headset is calculated from the tracking data of a camera system and from inertial measurements on the headset itself.

The core tracking algorithm of LPVR-AIR is similar to our LPVR-CAD solution. We are combining this established tracking method with wireless data streaming.

This has a few significant advantages:

  • Rendering detailed VR content is computationally too heavy for the embedded hardware on the headset itself. Therefore, content needs to be rendered on an external computer and the result streamed to the headset. LPVR-AIR allows doing this.
  • Designers in e.g. the automotive space have their own preferred applications to create content, such as Autodesk VRED. These applications usually don’t run on a headset’s embedded hardware. With LPVR-AIR designers can use any application that normally works with a Windows-based PC.

Technical Implementation

The block diagram below shows how the LPVR-AIR system is implemented. While we in principle support any Android-based standalone VR headset, we currently focus on the Meta Quest line of HMDs, specifically the Meta Quest Pro and Meta Quest 3.

Our solution effectively enables designers to explore a large 3D design space with full high-resolution renderings using a lightweight headset. LPVR-AIR even allows several users to interact in a design space. An example of such a use case is shown in the video at the top of this post: two users in our office in Tokyo, tracked by LPVR tracking, explore a car design together.

Improved Design Process

This opens new possibilities for automotive, industrial, architectural and many other design applications, leading to increased performance of designers and a higher success rate of their designs. LPVR-AIR is based on the ALVR wireless streaming engine, which we have extended to work with our FusionHub sensor fusion solution.

In the long term, the ALVR engine makes it easy for us to support a number of different HMDs: in addition to the Meta Quest, also the Varjo and eventually the Apple Vision Pro series, as shown in the image below. With VRED we have an outstanding rendering solution at the base of LPVR-AIR that allows designers to create photorealistic content while providing extensive collaboration abilities.

If you would like to move towards immersive and interactive 3D design, don’t hesitate to consult with us and give our LPVR-AIR on-premise collaborative design solution a try!

Immersive Driving Assistance with LPVIZ

How LPVIZ Augments Driving Reality

Going beyond a simple screen replacement, LPVIZ is an augmented reality driving assistance solution for the car. It can display relevant content to a driver or passenger in 3D, superimposed on reality. Content can be placed anywhere inside the car, such as a virtual speedometer over the dashboard, or anywhere outside of the car, such as point-of-interest markers or navigation guidance.

The video at the top of this post shows what a drive around the block in Azabujuban, Tokyo, with LPVIZ looks like. A virtual dashboard is projected onto the center console of the vehicle. Arrows on the ground show lane guidance to the driver. Red Google Maps-style markers show points of interest. The virtual dashboard stays fixed to the same location in the car, even when the vehicle turns. The navigation arrows move smoothly and the point-of-interest markers are globally anchored.

Perfectly Tuned Components

LPVIZ consists of several components that all have to interact perfectly to create a compelling and safe augmentation experience. The illustration below shows a block diagram of how the hardware components are connected.

Accurate tracking is required to display useful content to the driver: the HMD pose in the local car coordinate system and the vehicle pose in a globally anchored frame. Precise calibration of all components of the solution is essential to provide the highest visual fidelity and driver safety. Our LPVIZ product makes all parts of the system available in a compact form factor, ready to be integrated with any vehicle.
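As a rough illustration of the two coordinate frames involved, the following Python sketch shows how car-anchored content (the virtual dashboard) and globally anchored content (point-of-interest markers) can be composed from the two tracked poses. All matrices and names are hypothetical placeholders, not LPVIZ internals:

```python
import numpy as np

# Sketch of the two-frame rendering problem: content anchored to the car
# only needs the HMD pose in the car frame; globally anchored content also
# needs the car's pose in the world. 4x4 homogeneous transforms assumed.

def view_matrix(T_frame_hmd):
    """View matrix for content defined in the given frame."""
    return np.linalg.inv(T_frame_hmd)

# Inputs per frame (from head tracking and vehicle localization):
T_car_hmd   = np.eye(4)   # HMD pose in the car frame (head tracking)
T_world_car = np.eye(4)   # vehicle pose in the world (global localization)

# The dashboard is modeled in car coordinates: use the car-frame pose.
view_dashboard = view_matrix(T_car_hmd)

# POI markers are modeled in world coordinates: chain the transforms first.
T_world_hmd = T_world_car @ T_car_hmd
view_poi = view_matrix(T_world_hmd)
```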

The Past, Present and the Future

In the current development stage we’re focusing on the most essential aspects of the solution: displaying a virtual dashboard, navigation information and points of interest. While this content is proprietary, we’re opening our software to third-party developers so they can create their own content building on our platform.

Currently we’re offering LPVIZ as a B2B solution for prototyping, design and research. However, we’re working on reducing system complexity to make it work as a consumer-facing automotive aftermarket solution, to be released later this year.

Towards a Consumer Product

We are very proud of the progress our team has made in the past months. We’re moving closer to making our vision of an augmented reality driving assistance system a reality for everyone. One very important take-away from our recent developments is that it’s indeed possible to provide real utility to the driver using technology that is readily available. It might still be early days, but we’re edging towards a product that could appeal to a wider consumer market. This is just the beginning.

High-performance Use Cases of LPVR & Varjo Headsets

Components of a VR/AR Operating System

Augmented and virtual reality technology helps boost worker productivity in various fields such as automotive, aerospace, industrial production, and more. Whereas the context of these applications is usually fairly specific, some aspects are common to many of these use cases. In this article, we will specifically explore the topic of pose tracking of Varjo head-mounted displays (HMDs) based on LP-RESEARCH’s LPVR operating system. Further on, we will show two customer use cases that utilize LPVR in different ways.

In a typical VR/AR setup, you find three main subsystems as shown in the illustration below:

With our LPVR operating system, we connect these three building blocks of a VR/AR system and make them communicate seamlessly with each other while providing a simple, unified interface to the user. Depending on the specific use case, users might select different types of hardware to build their VR/AR setup. Therefore LPVR offers a wide range of interface options to adapt to systems from various manufacturers.

LPVR Flavors

LPVR comes in different flavors; end applications can be grouped into two categories:

  • LPVR-CAD – Static AR/VR setups, where multiple users operate and collaborate in one or more joint tracking volumes. These tracking volumes can be situated in different locations.
  • LPVR-DUO – AR/VR systems that are located in a vehicle or on a motion platform: such systems have special requirements, especially on the tracking side. If, for example, you want to track a headset inside a car, displaying a virtual cockpit anchored to the car frame and a virtual outside world fixed to a global coordinate system, you need means of locating the car in the world and of referencing the HMD locally in the car frame.


In the following paragraphs, we will introduce two customer use cases that cover these two basic scenarios.

Large-scale Industrial Design at Hyundai

– Varjo XR-3 at Hyundai Design Center with optical markers attached. Image credit: Hyundai

For the Korean automotive company Hyundai Motor Company, we created a large, location-based virtual reality installation at their research and development center in Namyang, Korea. The system is used to showcase, amend and modify prototype and production-ready automobile designs.

This application uses optical outside-in tracking and LP-RESEARCH’s LPVR-CAD solution to track up to 20 users wearing head-mounted displays. While LPVR allows a mix of different headset types to operate in the same tracking volume, the Varjo XR-3 gives the most outstanding performance for inspecting objects in high resolution and great detail. In addition to an HMD, users carry hand controllers, for a total of more than 40 tracked objects in a space of close to 400 sqm.

– Hyundai’s collaborative virtual reality design experience. Image credit: Hyundai

Responsiveness is achieved by using LPVR-CAD to combine data from the inertial measurement unit built into each headset with information from the optical tracking system. The optical system uses 36 infrared cameras to track the 160 markers attached to the HMDs and hand controllers. LP-RESEARCH’s sensor fusion algorithms provide perfectly smooth and uninterrupted position and orientation data for each user’s HMD.

Depending on the type of headset, users either wear a backpack PC, connect to a host wirelessly or use an extra-long cable to connect directly to a rendering PC outside the tracking volume.

“Currently, we are actively utilizing VR from the initial development stage to the point of development. In the future, we plan to increase accessibility and usability by simplifying equipment using wireless HMDs. For this, improving the speed and stability of wireless internet is essential, which we plan to address by introducing 5G. In addition, LP RESEARCH’s technology is essential for multi-user location sharing within a virtual space.” – SungMook Kang, Visualization Specialist, Hyundai Motor Company

Next-level Automotive Entertainment with CUPRA

Imagine... playing Mario Kart: your hands are gripping the wheel, and you are in Neo Tokyo, on a race track. Futuristic buildings fly by while you race ahead, drifting through long turns and leaving your competitors behind you.

Now imagine you are no longer in your living room: you are sitting in an actual race car, buzzing around in an empty parking lot. Instead of looking through the windshield with your own eyes, you are wearing a Varjo XR-3 HMD. What you see outside the car is a virtual world; it’s Neo Tokyo.

– The view through the Varjo XR-3 headset. Image credit: CUPRA

As the car moves around the parking lot, you move inside the virtual world. When you move your head inside the car’s cockpit, the motions of your head are accurately tracked.

– Varjo XR-3 inside the cabin of the Urban Rebel. Image credit: Fonk Magazine

– Cupra’s Urban Rebel drifting on the test course

Together with the Norwegian company Breach VR, we have implemented this experience for the automotive company CUPRA. CUPRA is relentlessly pushing the technology of their vehicles into the future, striving to provide a novel driving experience to their customers.

Tracking of the vehicle and of the Varjo XR-3 inside the vehicle is achieved with LP-RESEARCH’s automotive tracking system LPVR-DUO. As the headset’s gyroscope records the superimposed motion of the car and of the user inside the car, a specialized sensing system and algorithm are required to separate the two.
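As a rough illustration of this separation problem (LPVR-DUO’s actual algorithm is considerably more involved, with full sensor fusion and mounting calibration), the Python sketch below shows the basic idea of subtracting the vehicle’s angular velocity, measured by a car-fixed reference IMU, from the head-mounted gyroscope reading. Names and values are hypothetical:

```python
import numpy as np

# Sketch: recover the head's rotation relative to the cabin by removing
# the vehicle's own angular velocity (measured by a car-fixed reference
# IMU) from the head-mounted gyroscope reading.

def head_rate_in_car_frame(gyro_head, gyro_car, R_head_car):
    """
    gyro_head:  angular velocity from the HMD's gyroscope (head frame, rad/s)
    gyro_car:   angular velocity from the car-fixed reference IMU (car frame)
    R_head_car: rotation taking car-frame vectors into the head frame
    """
    # Express the car's rotation in the head frame, then subtract it.
    return gyro_head - R_head_car @ gyro_car

# Example: the car yaws at 0.5 rad/s while the head is still relative to
# the cabin; the relative head rate correctly comes out as zero.
omega = head_rate_in_car_frame(
    gyro_head=np.array([0.0, 0.0, 0.5]),
    gyro_car=np.array([0.0, 0.0, 0.5]),
    R_head_car=np.eye(3),
)
```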

The result of this cascade of exceptional technology is a compellingly immersive driving experience of the future. The combination of an outstanding visualization device like the Varjo XR-3, state-of-the-art LPVR tracking, BreachVR’s 3D software and design and, last but not least, the incredible CUPRA race cars makes for an exciting ride that you’ll greatly enjoy and never forget. Come and join the ride!

Check out this blog post on the Varjo Insider Blog.

Check out our Instagram for further use cases with Varjo’s HMDs: @lpresearchinc
