About Klaus Petersen

I like to create magical things, especially projects related to new technologies like augmented and virtual reality, mobile robotics and MEMS-based sensor networks. I code in C(++) and Python, trying to keep up with my very talented colleagues :-)

LPMS Operator’s Manual Update

It’s been a long time coming, but we have finally updated our reference manual to cover the latest generation of sensors.

The manual is accessible through our documentation & support page or directly from here.

Below is a list of the most important updates, some of which are fixes that customers have been requesting for quite a while:

  • Removed hardware-specific parts. These are now covered in the quick start manuals.
  • Corrected scaling factors for all non-floating-point data transmission modes.
  • Corrected error in description of reset modes.
  • Moved the detailed description of the to-be-deprecated LpSensor library to the appendix.
  • Added a list of APIs for direct sensor programming. OpenZen will replace LpSensor.

Machine Learning for Context Analysis

Deterministic Analysis vs. Machine Learning

Machine learning and artificial intelligence (AI) are important methods that allow machines to classify information about their environment. Today’s smart devices integrate an array of sensors that constantly measure and save data. At first thought one would imagine that the more data is available, the easier it is to draw conclusions from this information. In fact, however, larger amounts of data become harder to analyze using deterministic methods (e.g. thresholding). While such methods by themselves can work efficiently, it is difficult to decide which analysis parameters to apply to which parts of the data.

Using machine learning techniques, on the other hand, this procedure of finding the right parameters can be greatly simplified. By teaching an algorithm which data corresponds to a certain outcome using training and verification data, analysis parameters can be determined automatically or at least semi-automatically. There is a wide range of machine learning algorithms, including the currently very popular convolutional neural networks.
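As a minimal sketch of what this looks like in practice, a classifier can determine its decision parameters directly from labeled training data. The snippet below assumes scikit-learn and pre-computed feature/label files; the file names and the choice of a random forest are placeholders for illustration, not our actual pipeline:

```python
# Minimal sketch: learning classification parameters from labeled data.
# Assumes scikit-learn; "features.npy" / "labels.npy" are hypothetical files.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X = np.load("features.npy")  # one row of extracted features per time window
y = np.load("labels.npy")    # ground-truth activity label per window

# Split into training and verification data
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

# The classifier determines its decision parameters automatically
clf = RandomForestClassifier(n_estimators=100)
clf.fit(X_train, y_train)

print("Verification accuracy:", clf.score(X_test, y_test))
```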

Context analysis setup overview

Context Analysis

Many health care applications rely on the correct classification of a user’s daily activities, as these strongly reflect their lifestyle and possible associated health risks. One way of detecting human activity is to monitor body motion using motion sensors such as gyroscopes, accelerometers etc. In the application described here we monitor a person’s mode of transportation, specifically:

  1. Rest
  2. Walking
  3. Running
  4. In car
  5. On train

To compare the results of a deterministic analysis with a machine learning approach, we first implemented a state machine based on deterministic analysis parameters.

Deterministic approach overview

The result is a relatively complicated state machine that needs to be very carefully tuned. This might have been because of our lack of patience, but in spite of our best efforts we were not able to reach detection accuracies of more than around 60%. Before spending a lot more time on manual tuning of this algorithm we switched to a machine learning approach.
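To give an idea of what such hand-tuned rules look like, here is a heavily simplified sketch of one classification step. The feature names and threshold values are made up for illustration and are not the parameters we actually used:

```python
# Simplified sketch of a threshold-based (deterministic) classifier for one
# window of motion data. All feature names and thresholds are illustrative.
def classify_window(accel_var, step_freq, speed_estimate):
    """accel_var: accelerometer variance, step_freq: step frequency [Hz],
    speed_estimate: rough speed estimate [m/s], e.g. from GPS."""
    if speed_estimate > 10.0 and step_freq < 0.5:
        # Moving fast without stepping: some kind of vehicle
        return "on train" if accel_var < 0.1 else "in car"
    if step_freq > 2.5:
        return "running"
    if step_freq > 1.0:
        return "walking"
    return "rest"

# A full implementation additionally needs state transition rules, hysteresis
# and per-state parameter sets, which quickly becomes hard to tune by hand.
```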

Machine learning approach overview

The resulting system structure looks noticeably simpler than the deterministic state machine. Besides standard feature extraction, a central part of the algorithm is the data logging and training module. We sampled over 1 million training samples to generate the parameters for our detection network. As a result, even though we used a relatively simple machine learning algorithm, we were able to reach a detection accuracy of more than 90%. A comparison between ground truth data and classification results from raw data is displayed below.
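As a rough illustration of the feature extraction stage feeding the classifier, the following sketch computes simple statistical features per window; the window length and feature choice are assumptions for this example, not our exact configuration:

```python
# Sketch of windowed feature extraction feeding a trained classifier.
# Assumes NumPy arrays of raw accelerometer/gyroscope samples.
import numpy as np

WINDOW = 200  # samples per analysis window (assumed: 2 s at 100 Hz)

def extract_features(accel, gyro):
    """Compute simple statistical features per non-overlapping window."""
    features = []
    for start in range(0, len(accel) - WINDOW + 1, WINDOW):
        a = accel[start:start + WINDOW]
        g = gyro[start:start + WINDOW]
        features.append([
            a.std(), a.mean(), np.abs(np.diff(a, axis=0)).mean(),
            g.std(), g.mean(),
        ])
    return np.array(features)

# With a classifier clf trained on the logged data as described above,
# window-by-window predictions can be compared against the ground truth:
# predictions = clf.predict(extract_features(accel_log, gyro_log))
```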

Context analysis algorithm result

Conclusion

We strongly believe in the use of machine learning / AI techniques for sensor data classification. In combination with LP-RESEARCH sensor fusion algorithms, these methods add a further layer of insight for our data analysis customers.

If this topic sounds familiar to you and you are looking for a solution to a related problem, contact us for further discussion.

IMUcore Sensor Fusion

Introducing IMUcore

IMUcore is the central algorithm working inside all LP-RESEARCH IMUs. It collects orientation data from several sources and combines them into fast and drift-free tilt and direction information. To allow IMUcore to work with any type of MEMS sensor, various online and offline calibration methods have been implemented to guarantee high-quality data output. The algorithm is very versatile and computationally lightweight; it can be implemented on embedded MCUs with minimal power consumption.

IMUcore is now available as a solution from LP-RESEARCH. Please contact us here for more information or a price quotation.

Overview of embedded sensor fusion in LPMS devices

Sensor Fusion Filter Overview

IMUcore uses gyroscope data as the basis for calculating orientation. Errors introduced through measurement noise and drift are corrected using accelerometer and compass data. Optionally, the sensor fusion can be extended with an optical (or other) tracking system to additionally provide position information.
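The basic principle can be illustrated with a generic complementary filter for the tilt angle. This is a textbook sketch under simplifying assumptions, not the actual IMUcore implementation: gyroscope rates are integrated for short-term accuracy, while the accelerometer provides a slow, drift-free reference.

```python
# Generic complementary-filter sketch (not the IMUcore implementation):
# gyroscope integration corrected by an accelerometer-based tilt reference.
import math

ALPHA = 0.98  # weight of the gyroscope path; (1 - ALPHA) corrects its drift

def update_pitch(pitch, gyro_rate_y, accel_x, accel_z, dt):
    """One filter step for the pitch angle [rad].
    gyro_rate_y: angular rate around the pitch axis [rad/s],
    accel_x, accel_z: accelerometer readings [m/s^2], dt: time step [s]."""
    # Short-term: integrate the gyroscope rate
    pitch_gyro = pitch + gyro_rate_y * dt
    # Long-term: absolute tilt reference from the measured gravity direction
    pitch_accel = math.atan2(-accel_x, accel_z)
    # Blend both estimates; compass data corrects heading in the same fashion
    return ALPHA * pitch_gyro + (1.0 - ALPHA) * pitch_accel
```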

All aspects of the IMUcore algorithm in one image

If this topic sounds familiar to you and you are looking for a solution to a related problem, contact us for further discussion.

iOS Support for LPMS-B2

LPMS-B2 supports Bluetooth Low Energy (Bluetooth 4) in addition to Bluetooth Classic. This allows us to connect the sensor to Apple mobile devices such as the iPad, iPhone or Apple Watch. We have recently created a library that enables development of applications supporting LPMS-B2 on these devices.

The library can be accessed via our open source repository.

The repository contains a skeleton application that shows usage of the most basic parts of the library. The library itself is contained in the following files:

A sensor object is initialized and connected using the following code:

More coming soon...

LPVR Manual & VIVE Pro Holder Prototype

LPVR Quick Start Guide

We have written a quick start guide for the LPVR system. The guide describes the assembly of the VIVE marker holder and the installation/usage of the LPVR SteamVR driver. The guide at the moment doesn’t contain information about the new VIVE Pro holder, but this will be added later.

Download the guide from here: LpvrGettingStarted20180402.pdf

VIVE Pro Sensor Holder

We have been working on the development of an optical marker/sensor holder for the VIVE Pro for a few weeks. It is not completely finished yet, but below is a photo of a prototype.

VIVE Pro with IMU and markers