About Klaus Petersen

I like to create magical things, especially projects related to new technologies like augmented and virtual reality, mobile robotics and MEMS-based sensor networks. I code in C(++) and Python, trying to keep up with my very talented colleagues :-)

LPVR-DUO Featured at Unity for Industry Japan Conference

Unity for Industry Conference – XR Moves to the Next Stage

LPVR-DUO has been featured at the Unity for Industry online conference in Japan. TOYOTA project manager Koichi Kayano introduced LPVR-DUO with Varjo XR-1 and ART Smarttrack 3 for in-car augmented reality (see the slide above).

Besides explaining the fundamental functional principle of LPVR-DUO inside a moving vehicle – using a fusion of HMD IMU data, vehicle-fixed inertial measurements and outside-in optical tracking information – Mr. Kayano presented videos of content for a potential end-user application:

Based on a heads-up display-like visualization, TOYOTA’s implementation shows navigation and speed information to the driver. The images below show two driving situations with a virtual dashboard augmentation overlay.
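
To make the fusion principle mentioned above a little more concrete, here is a heavily simplified Python sketch of the differential IMU idea: the head’s rotation relative to the car is predicted from the difference between the HMD and vehicle gyroscope rates, then slowly corrected toward the outside-in optical pose. All names, the aligned-axes assumption and the blending constant are illustrative; this is not LPVR-DUO’s actual algorithm.

```python
# Heavily simplified sketch of the differential IMU fusion idea.
# Frame conventions, names and the blending constant are illustrative
# assumptions, not LPVR-DUO's actual implementation.
import numpy as np

def quat_mul(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def integrate(q, omega, dt):
    """First-order integration of a body-frame angular rate (rad/s)."""
    q = q + 0.5 * dt * quat_mul(q, np.array([0.0, *omega]))
    return q / np.linalg.norm(q)

def fuse_step(q_head_in_car, omega_hmd, omega_car, q_optical, dt, alpha=0.02):
    # Gyro prediction: the head's rate relative to the car is approximately
    # the HMD gyro rate minus the vehicle gyro rate (assuming aligned axes).
    q_pred = integrate(q_head_in_car, omega_hmd - omega_car, dt)
    # Complementary correction: pull slowly toward the outside-in optical
    # pose to cancel gyro drift. Align signs first (q and -q are the same
    # rotation), then blend and renormalize.
    if np.dot(q_pred, q_optical) < 0.0:
        q_optical = -q_optical
    q = (1.0 - alpha) * q_pred + alpha * q_optical
    return q / np.linalg.norm(q)
```

A production filter would use a proper error-state formulation instead of a linear quaternion blend; the sketch only captures the principle described above.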

AR Head-Mounted Display vs. Heads-Up Display

This use case leads us to a discussion of the differences between an HMD-based visualization solution and a heads-up display (HUD) that is, for example, mounted in a fixed position above a car’s console. While putting on a head-mounted display does require a minor additional effort from the driver, a wearable device offers several advantages in this scenario.

Content can be displayed at any location in the car: on the dashboard, the center console, the side windows and so on. A heads-up display, in contrast, works only in one specific spot.

As the HMD shows separate images to the driver’s left and right eye, we can display three-dimensional content. This allows for accurate placement of objects in 3D space. Correct positioning within the driver’s field of view is essential for safety-relevant data: in case of a hazardous situation detected by the car’s sensor array, the driver will know exactly where the danger is coming from.

These are just two of many aspects that set HMD-based augmented reality apart from a heads-up display. The fact that large corporations like TOYOTA are starting to investigate this specific topic shows that the application of augmented reality in the car will be an important feature for the future of mobility.

NOTE: Image contents courtesy of TOYOTA Motor Corporation.

See-through Display First Look – LPVIZ (Part 3)

Virtual Dashboard Demonstration

This is a follow-up to the introduction of our in-vehicle AR head-mounted display LPVIZ in part 1 and part 2.

To test LPVIZ we created a simple demo scenario of an automotive virtual dashboard: a Unity scene with graphic elements commonly found on a vehicle dashboard, animated to make the scene look more realistic.

This setup is meant for static testing at our shop. For further experiments inside a moving vehicle we are planning to connect the animated elements directly to car data (speed etc.) communicated over the CAN bus.
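
As a rough illustration of what that could look like, the following Python sketch uses the python-can library to pull a speed value off the bus. The CAN arbitration ID, byte layout and scale factor are hypothetical placeholders; real signal definitions are vehicle-specific and would normally come from a DBC file.

```python
# Minimal sketch of pulling vehicle speed off the CAN bus with python-can.
# The arbitration ID, byte layout and scale factor below are hypothetical
# placeholders; real signal definitions are vehicle-specific.
import can

SPEED_CAN_ID = 0x123   # hypothetical ID of the frame carrying vehicle speed
SPEED_SCALE = 0.01     # hypothetical scaling from raw counts to km/h

def read_speed(bus: can.BusABC):
    """Return the latest speed in km/h, or None if no matching frame arrived."""
    msg = bus.recv(timeout=1.0)
    if msg is None or msg.arbitration_id != SPEED_CAN_ID:
        return None
    raw = int.from_bytes(msg.data[0:2], byteorder="big")
    return raw * SPEED_SCALE

if __name__ == "__main__":
    # SocketCAN on Linux as an example transport; python-can supports others.
    with can.interface.Bus(channel="can0", bustype="socketcan") as bus:
        while True:
            speed = read_speed(bus)
            if speed is not None:
                # A value like this would drive the animated speedometer.
                print(f"vehicle speed: {speed:.1f} km/h")
```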

The virtual dashboard is only a very simple example to show the basic functionality of LPVIZ. As described in a previous post, much more sophisticated applications can be implemented.

The video above was taken through the right-eye optical waveguide display of LPVIZ. We recorded it with a regular smartphone camera, so the quality is limited. Nevertheless, it confirms that the display is working and correctly shows the virtual dashboard.

The user is looking at the object straight ahead. If the user rotates his head or changes position, the perspective of the object changes accordingly. An important point to mention is the high luminosity of the display: we made this recording with the interior lighting in our shop at its normal level and without any additional shade in front of the display.
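
The perspective change itself is plain coordinate geometry: the virtual object is anchored in the car frame and re-expressed in the head frame for every rendered frame. Below is a minimal numpy sketch with illustrative numbers and an assumed frame convention.

```python
# Toy example of why the view changes with head motion: a virtual object is
# anchored in the car frame and re-expressed in the head frame each frame.
# Frame convention (x forward, y left, z up) and numbers are illustrative.
import numpy as np

def object_in_head_frame(p_obj_car, R_head_car, t_head_car):
    # p_obj_car:  object position in the car frame (3-vector)
    # R_head_car: head orientation in the car frame (3x3 rotation matrix)
    # t_head_car: head position in the car frame (3-vector)
    return R_head_car.T @ (p_obj_car - t_head_car)

# A dashboard element 0.8 m ahead of and 0.3 m below the nominal head position.
p_dashboard = np.array([0.8, 0.0, -0.3])

# Head at the nominal position, looking straight ahead.
print(object_in_head_frame(p_dashboard, np.eye(3), np.zeros(3)))

# Head moved 10 cm to the left: the same element now sits further to the
# right in the head frame, so its rendered perspective shifts accordingly.
print(object_in_head_frame(p_dashboard, np.eye(3), np.array([0.0, 0.1, 0.0])))
```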

How to Use LPMS IMUs with LabVIEW

Introduction

LabVIEW by National Instruments (NI) is one of the most popular multi-purpose solutions for measurement and data acquisition tasks. A wide range of hardware components can be connected to a central control application running on a PC. This application environment includes a full graphical programming language that allows the creation of so-called virtual instruments (VIs).

Data can be acquired inside a LabVIEW application via a variety of communication interfaces, such as Bluetooth, serial port etc. A LabVIEW driver that can communicate with our LPMS units has been a frequently requested feature from our customers for some time, so we decided to create this short example as a general guideline.

A Simple Example

The example shown here works specifically with LPMS-B2, but it is easily customizable to work with other sensors in our product line-up. To communicate with LPMS-B2 we use LabVIEW’s built-in Bluetooth access modules. We then parse the incoming data stream to display the measured values.

The source code repository for this example is here.

Figure 1 – Overview of a minimal virtual instrument (VI) to acquire data from LPMS-B2

Fig. 1 shows an overview of the example design, which acquires the accelerometer X, Y and Z axes of the IMU and displays them on a simple front panel. Figs. 2 & 3 below show the virtual instrument in more detail. After reading the raw data stream from the Bluetooth interface, the stream is converted into a string. The string is then scanned for the start and stop character sequences, and the actual data is extracted based on its position in the data packet.

Figure 2 – Bluetooth access and initial data parsing

Figure 3 – Extraction of timestamp, accelerometer X, Y, Z values
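
For readers who prefer text-based code to a graphical VI, the following Python sketch mirrors the same parsing steps: locate the start and stop sequences in the raw stream, then unpack the fields by their position in the packet. The framing bytes and field layout used here are illustrative assumptions; the authoritative packet format is documented in the LPMS manual.

```python
# Python sketch of the parsing steps shown in the VI: locate the start/stop
# sequences in the raw stream, then unpack fields by their position in the
# packet. Framing bytes and field layout here are illustrative assumptions;
# see the LPMS manual for the authoritative packet format.
import struct

START = b"\x3a"      # assumed start-of-packet marker
STOP = b"\x0d\x0a"   # assumed end-of-packet marker

def extract_packets(stream: bytes):
    """Yield the payload of every complete packet found in the stream."""
    pos = 0
    while True:
        start = stream.find(START, pos)
        if start < 0:
            return
        stop = stream.find(STOP, start)
        if stop < 0:
            return
        # A real parser would use the packet's length field instead of
        # scanning for the stop bytes, which could also occur in the payload.
        yield stream[start + 1:stop]
        pos = stop + len(STOP)

def parse_payload(payload: bytes):
    # Hypothetical layout: a 32-bit timestamp followed by three little-endian
    # float32 accelerometer values (X, Y, Z).
    return struct.unpack_from("<Ifff", payload, 0)

if __name__ == "__main__":
    # Synthetic two-packet stream as a quick self-test.
    demo = (START + struct.pack("<Ifff", 1, 0.0, 0.0, 9.81) + STOP
            + START + struct.pack("<Ifff", 2, 0.1, 0.0, 9.79) + STOP)
    for payload in extract_packets(demo):
        print(parse_payload(payload))
```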

Notes

Please note that the example requires manually entering the Bluetooth ID of the LPMS-B2 in use. The data parsing is configured statically, so the output data of the sensor needs to be configured and saved to the sensor’s flash memory with the LPMS-Control application. Please refer to the LPMS manual for details.

An initial version of this virtual instrument was kindly provided to us by Dr. Patrick Esser, head of the Movement Science Group at Oxford Brookes University, UK.

Collaboration with Pimax

We are happy to announce a collaboration with the head-mounted display (HMD) manufacturer Pimax. Pimax HMDs feature very high-resolution displays (up to 8K) and an industry-leading field of view (up to 200°). By default, Pimax HMDs support SteamVR tracking and are therefore limited to relatively small tracking volumes.

We developed a special driver that allows our LPVR middleware LPVR-CAD and LPVR-DUO to work with Pimax headsets. Using LPVR, the headsets can now be used within a large-scale, location-based context, in connection with outside-in optical systems such as ART (Advanced Real-Time Tracking).

As Pimax is planning to implement UltraLeap hand tracking in their HMDs in the future, we are confident that we will also be able to extend our inside-out tracking algorithm to their devices.

The video above shows the basic functionality of tracking a Pimax HMD using LPVR and an optical tracking system. The headset’s motions are represented in SteamVR. For this demonstration the tracking volume is relatively small, but can be extended easily by using more outside-in tracking cameras.

This video was kindly provided to us by evoTec Solutions, a new Swiss company that focuses on virtual reality (VR) solutions for corporations. Contact them for further information!

Design Prototype and Inside-out Tracking – LPVIZ (Part 2)

LPVIZ Prototype Industrial Design

This post is a follow-up to the introduction of our augmented reality (AR) headset LPVIZ. See our previous post here.

For the past two months the LPVIZ team has been working hard to improve our initial prototype: we have enhanced the device’s appearance and optimized its ergonomics. My colleague Seeon Mitchel has made draft 3D prints of the design he has planned for the initial release of LPVIZ. The results look excellent (Figures 1 & 2).

The ring design for fixing the unit to the user’s head feels comfortable; even during longer sessions the unit does not cause neck fatigue. Below are two photos of the current functional prototype with the newly printed shell.

Figure 1, 2 – The fully functional LPVIZ design prototype

Inside-out Tracking and Gesture Recognition

The latest LPVIZ prototype features a built-in stereo camera. We are using the excellent Rigel module by UltraLeap, which allows us to run a SLAM (simultaneous localization and mapping) algorithm and UltraLeap’s hand tracking at the same time.

Using the Rigel’s stereo camera, my colleague Thomas Hauth has developed a state-of-the-art inside-out tracking algorithm that allows the headset to be used inside a vehicle, even if no special cameras are installed. The video (Figure 3) below shows the fundamental functionality of the algorithm.

Figure 3 – The video shows the fundamental functionality of the LPSLAM inside-out tracking algorithm

It is important to note that this will not be a full replacement for ART outside-in tracking inside the vehicle: ART’s tracking engine is more accurate and more robust under difficult lighting conditions. Still, we also want to serve customers with a smaller budget or without the option of installing additional equipment inside their vehicle.
