About Klaus Petersen

I like to create magical things, especially projects related to new technologies like augmented and virtual reality, mobile robotics and MEMS-based sensor networks. I code in C(++) and Python, trying to keep up with my very talented colleagues :-)

LP Sensors at Bandai Namco VR Zone: Ghost in the Shell VR Experience

Shinjuku VR Zone logo

Bandai Namco has recently opened a VR entertainment center in central Tokyo called VR Zone. One part of VR Zone is a revolutionary new free-roaming arcade experience built around the theme of the anime classic Ghost in the Shell.

VR Zone group photo

LP-RESEARCH has provided IMU technology that allows the system to operate with the HTC VIVE. The result is a uniquely fascinating and fun arcade experience. Our team was very happy to test the experience before the official opening of VR Zone and greatly enjoyed it. VR at its finest, highly recommended!

IMU-based Dead Reckoning (Displacement Tracking) Revisited

In a blog post a few years ago, we published results of our experiments with direct integration of linear acceleration from our LPMS-B IMU. At that time, although we were able to process data in close to real-time, displacement tracking only worked on one axis and for very regular up-and-down motions.

Since then the measurement quality of our IMUs has improved and we have put further work into researching dead reckoning applications. The fact remains that low-cost MEMS sensors, as used in our LPMS-B2 devices, are not suitable for displacement measurement over extended periods of time or with high accuracy. For some applications, however, such as sports motion measurement or as one component in a larger sensor fusion setup, the results are very promising.
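
To illustrate the basic principle, here is a minimal Python sketch of displacement tracking by direct double integration of linear acceleration, with a crude zero-velocity update (ZUPT) to limit drift. The sample rate, threshold and data layout are assumptions for the example, not the parameters of our actual algorithm.

```python
import numpy as np

def dead_reckoning(lin_acc, dt=1.0 / 400.0, zupt_threshold=0.05):
    """Estimate displacement by double integration of linear acceleration.

    lin_acc: (N, 3) array of gravity-compensated acceleration in m/s^2,
             e.g. streamed from an IMU at a fixed sample rate.
    dt:      sample period in seconds (400 Hz assumed here).
    zupt_threshold: acceleration magnitude below which the sensor is
             assumed stationary and velocity is reset (simple ZUPT).
    """
    velocity = np.zeros(3)
    position = np.zeros(3)
    trajectory = []
    for a in lin_acc:
        velocity += a * dt                      # first integration: velocity
        if np.linalg.norm(a) < zupt_threshold:  # crude zero-velocity update
            velocity[:] = 0.0                   # limits unbounded drift
        position += velocity * dt               # second integration: position
        trajectory.append(position.copy())
    return np.asarray(trajectory)
```

Without the zero-velocity resets, any small bias in the acceleration grows quadratically in the position estimate, which is why unaided double integration only stays usable for short motions.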

A further experiment shows this algorithm applied to the evaluation of boxing motions. This system might work as a base component for IoT boxing gloves that allow automatic evaluation of an athlete's technique and strength, or it might be integrated into an advanced controller for virtual reality sports.

As usual, please contact us for further information.

Meet Xikaku

We are proud to present our new partner company Xikaku. Xikaku is a US company located in Los Angeles, focusing on the development of technology related to the field of augmented reality (AR). Visit their website here.

Location-based VR Tracking Solution

LPVR Location-based VR Tracking Introduction

NOTE: In case you are looking for our LPVR middleware for automotive and motion simulator applications, please refer to this page.

UPDATE 1: Full SteamVR platform support

UPDATE 2: Customer use case with AUDI and Lightshape

UPDATE 3: Customer use case with Bandai Namco

Consumer virtual reality head mounted display (HMD) systems such as the HTC VIVE support so-called room scale tracking. These systems are able to track the head and controller motion of a user not only in a sitting or other stationary position, but also during free, room-wide movement. The volume of this room scale tracking is limited by the capabilities of the specific system, usually covering around 5m x 5m x 3m. While this space may be sufficient for single-user games or applications, multi-user, location-based VR applications in particular, such as arcade-style game setups or enterprise applications, require larger tracking volumes.

Optical tracking systems such as Optitrack offer tracking volumes of up to 15m x 15m x 3m. Although the positioning accuracy of optical tracking systems is in the sub-millimeter range, orientation measurement in particular is often not sufficient to provide an immersive experience to the user. Image processing and signal routing may also introduce additional latency.

Our location-based VR / large room-scale tracking solution solves this problem by combining optical tracking information with inertial measurement data using a special predictive algorithm based on a head motion model.

Compatible HMDs: HTC VIVE, HTC VIVE Pro
Compatible optical tracking systems: Optitrack, VICON, ART, Qualisys, all VRPN-compatible tracking systems
Compatible software: Unity, Unreal, Autodesk VRED, all SteamVR-compatible applications

This location-based VR solution is now available from LP-RESEARCH. Please contact us here for more information or a price quotation.

IMU and Optical Tracker Attachment

The system follows each HMD using a combination of optical and IMU tracking. A special 3D-printed holder is used to attach a high-quality IMU (LPMS-CU2) and optical markers to an HTC VIVE headset.

HTC VIVE with LP holder and IMU attached

To create a perfectly immersive experience for the user, optical information is augmented with data from an inertial measurement unit at a rate of 800 Hz. In addition to these high-frequency / low-latency updates, a head motion model is used to predict future movements of the player's head. This creates the impression of zero-latency gameplay.
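
As an illustration of the prediction step, the Python sketch below extrapolates a fused head pose forward by the expected rendering latency, using the estimated linear velocity and the gyroscope's angular velocity. The head motion model used in the actual system is not described here, so this constant-velocity extrapolation and all parameter values are assumptions for illustration only.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def predict_pose(position, velocity, orientation, angular_velocity, latency=0.020):
    """Extrapolate a fused head pose forward by the expected render latency.

    position, velocity:  3-vectors in meters and m/s from the fused state
    orientation:         scipy Rotation, world-from-head
    angular_velocity:    3-vector in rad/s in the headset (body) frame,
                         e.g. from the IMU gyroscope
    latency:             prediction horizon in seconds (20 ms assumed here)
    """
    # Constant-velocity model for the position.
    predicted_position = position + velocity * latency
    # Rotate by the angle accumulated over the latency interval
    # (body-frame angular velocity, so the delta is applied on the right).
    delta = R.from_rotvec(angular_velocity * latency)
    predicted_orientation = orientation * delta
    return predicted_position, predicted_orientation
```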

Overview of solution key functionality

Tracking Camera, HMD and Player Setup

In its current configuration the system can track up to 20 actors simultaneously, each holding a VIVE controller to interact with the environment. Players wear backpack PCs to provide visualization and audio.

Complete location-based VR system setup

Playground and Camera Arrangement

The solution covers an area of 15m x 15m or more. There is an outer border of 1.5m that is out of the detection range of the cameras. This results in an actual, usable playground area of 13.5m x 13.5m. The overall size of the playground is 182.25m². The cameras are grouped around the playground to provide optimum coverage of the complete area.

Location-based VR playground setup


Contact us for further information.

Optical-Inertial Sensor Fusion

Optical position tracking and inertial orientation tracking are well established measurement methods. Each of these methods has its specific advantages and disadvantages. In this post we show an opto-inertial sensor fusion algorithm that combines the capabilities of both to create a robust system for position and orientation tracking.

How It Works

The reliability of position and orientation data provided by an optical tracking system (outside-in or inside-out) can, for some applications, be compromised by occlusions and slow system reaction times. In such cases it makes sense to combine optical tracking data with information from an inertial measurement unit located on the device. Our optical-inertial sensor fusion algorithm implements this functionality, either for integration with an existing tracking system or for the development of a novel system for a specific application case.

The graphs below show two examples of how the signal from an optical positioning system can be improved using inertial measurements. Slow camera frame rates or occasional drop-outs are compensated for by information from the integrated inertial measurement unit, improving overall tracking performance.
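
The following Python sketch illustrates this principle with a simple fixed-gain observer, not our production filter: the IMU propagates position and velocity at a high rate, and each optical sample, whenever one arrives, pulls the estimate back towards the measured position. Gains, rates and names are assumptions for the example.

```python
import numpy as np

class OptoInertialFusion:
    """Minimal fixed-gain fusion of optical position and IMU acceleration.

    The IMU propagates position/velocity at a high rate; optical fixes,
    when available, correct the drifting inertial estimate. Gains are
    illustrative, not the parameters of a production filter.
    """

    def __init__(self, pos_gain=0.2, vel_gain=0.5):
        self.position = np.zeros(3)
        self.velocity = np.zeros(3)
        self.pos_gain = pos_gain  # how strongly a fix corrects the position
        self.vel_gain = vel_gain  # how strongly a fix corrects the velocity

    def imu_update(self, lin_acc, dt):
        """Propagate with gravity-compensated acceleration (m/s^2) at the IMU rate."""
        self.velocity += lin_acc * dt
        self.position += self.velocity * dt

    def optical_update(self, measured_position):
        """Correct the inertial estimate when an optical sample arrives.

        Called at the (lower) camera rate, or skipped entirely during
        occlusions; the inertial propagation bridges the gap.
        """
        error = measured_position - self.position
        self.position += self.pos_gain * error
        # Correct the velocity as well so the estimate stops drifting away.
        self.velocity += self.vel_gain * error
```

In a real system the fixed gains would typically be replaced by a Kalman-type filter that weights each correction by measurement and process noise and handles timestamps explicitly.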

Combination of Several Optical Trackers

For a demonstration, we combined three NEXONAR IR trackers and an LPMS-B2 IMU, mounted together as a hand controller. The system allows position and orientation tracking of the controller with high reliability and accuracy. It combines the strong aspects of outside-in IR tracking with inertial tracking, improving the system’s reaction time and robustness against occlusions.

Optical-Inertial Tracking in VR

The tracking of virtual reality (VR) headsets is one important area of application for this method. To keep the user immersed in a virtual environment, high quality head tracking is essential. Using opto-inertial tracking technology, outside-in tracking as well as inside-out camera-only tracking can be significantly improved.
