We mentioned before that our new AR development platform is in the making. It has generated quite a lot of attention at the recent Slush and Tech in Asia startup fairs, and we often have people interested in our work visiting our office for a demo. Last week, for example, Helmut Wenisch, Head of Corporate Technology at Siemens K.K., and Alok Kumar Dubey, also of Siemens K.K., visited us to experience our prototype first-hand. Thank you so much for coming by and for the inspiring conversation!
The folks at Google ATAP were kind enough to let us participate in the Project Soli alpha developer program. Please have a look at their website for more information about the project. Project Soli is a chip-sized miniature millimeter-wave radar, supported by a sophisticated DSP pipeline developed by Google. Based on this signal processing, finger gestures in the vicinity of the sensor can be analyzed and evaluated, allowing for new ways of human-device interaction.
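The Soli SDK and its DSP features are proprietary, but the first stage of most millimeter-wave radar pipelines is the classic range-Doppler transform: an FFT along fast time resolves target distance, an FFT along slow time resolves velocity. The sketch below illustrates only this generic idea with synthetic data; it is not the Soli API.

```python
# Generic range-Doppler processing sketch (not the Soli SDK).
import numpy as np

def range_doppler_map(frames):
    """frames: (n_chirps, n_samples) raw samples for one radar burst.
    Returns the 2D FFT magnitude: range along axis 1 (fast time),
    Doppler/velocity along axis 0 (slow time), zero Doppler centered."""
    # Window both dimensions to reduce spectral leakage.
    win_fast = np.hanning(frames.shape[1])
    win_slow = np.hanning(frames.shape[0])[:, None]
    spectrum = np.fft.fft(frames * win_fast, axis=1)                 # range FFT
    spectrum = np.fft.fft(spectrum * win_slow, axis=0)               # Doppler FFT
    return np.abs(np.fft.fftshift(spectrum, axes=0))

# Synthetic burst: one static reflector, beat frequency at range bin 8.
n_chirps, n_samples = 32, 64
t = np.arange(n_samples)
frames = np.tile(np.cos(2 * np.pi * 8 * t / n_samples), (n_chirps, 1))
rd = range_doppler_map(frames)
```

Since the reflector does not move, all energy ends up in the zero-Doppler row (the map's center after `fftshift`), at the range bin matching the simulated beat frequency.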
We have spent some time with the developer kit and built an application called Virtual Tape Measure. The purpose of this demo application is to replace a physical tape measure when, for example, checking the dimensions of a table while shopping for furniture. This is a fairly simple application of the Soli technology; we are currently looking into further, more complex use cases. Please see the diagram below describing the basic functionality of the system.
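In principle, such a virtual tape measure can be reduced to ranging the strongest reflector at two marked instants and reporting the difference. The sketch below shows this idea on synthetic range profiles; the function names and the range-bin scale are our own illustrative assumptions, not the actual application code.

```python
# Hedged sketch of the tape-measure idea: range the dominant reflector
# in two range profiles and take the difference. Bin size is assumed.
import numpy as np

BIN_SIZE_M = 0.005  # assumed range resolution per bin (5 mm)

def estimate_range_m(range_profile):
    """Distance to the strongest reflector in a range profile."""
    return np.argmax(np.abs(range_profile)) * BIN_SIZE_M

def measure_length(profile_start, profile_end):
    """Measured length = absolute difference of the two ranged positions."""
    return abs(estimate_range_m(profile_end) - estimate_range_m(profile_start))

# Synthetic profiles: reflector at bin 20 (0.10 m) and at bin 180 (0.90 m).
p1 = np.zeros(256); p1[20] = 1.0
p2 = np.zeros(256); p2[180] = 1.0
length = measure_length(p1, p2)  # 0.8 m
```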
In order to test the functionality of our sensor fusion algorithm for head-mounted display pose estimation, we connected one of our IMUs (LPMS-CURS2), a Nexonar infrared (IR) beacon and an LCD display to a Baofeng headset. The high stability of the IR tracking and the orientation information from the IMU, used as input to the sensor fusion algorithm, result in accurate, robust and responsive head tracking. See the figure below for details of the test setup. The video shows the resulting performance of the system.
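The general idea of combining a drift-free but lower-rate absolute reference with high-rate inertial data can be sketched as a simple complementary filter. Our actual fusion algorithm is not public; the class below, with its assumed gains and signal names, only illustrates the principle.

```python
# Minimal complementary-filter sketch for fusing absolute IR positions
# with high-rate inertial dead reckoning. Gains are illustrative.
import numpy as np

class ComplementaryPoseFilter:
    def __init__(self, alpha=0.9):
        self.alpha = alpha        # trust in the inertial prediction
        self.pos = np.zeros(3)
        self.vel = np.zeros(3)

    def predict(self, lin_acc, dt):
        """High-rate step: dead-reckon position from linear acceleration."""
        self.vel += lin_acc * dt
        self.pos += self.vel * dt

    def correct(self, ir_pos):
        """Absolute update: pull the estimate toward the drift-free IR fix."""
        self.pos = self.alpha * self.pos + (1 - self.alpha) * ir_pos

f = ComplementaryPoseFilter(alpha=0.9)
for _ in range(100):                                  # stationary headset
    f.predict(np.array([0.0, 0.0, 0.01]), dt=0.01)    # small accel bias
    f.correct(np.zeros(3))                            # IR bounds the drift
```

Without the `correct()` step the small bias would accumulate quadratically; with it, the position error stays bounded at the millimeter level in this toy run.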
Position tracking based purely on linear acceleration measurements is a difficult problem. To obtain actual position values, the linear acceleration (i.e. accelerometer data minus gravity) needs to be integrated twice. Even a minimal bias on one of the tracked axes causes the resulting position values to drift off rapidly.
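This drift is easy to show numerically: double-integrating even a tiny constant bias produces a position error that grows quadratically with time, following 0.5 · b · t².

```python
# Numeric illustration of double-integration drift from a constant bias.
import numpy as np

def double_integrate(acc, dt):
    vel = np.cumsum(acc) * dt
    pos = np.cumsum(vel) * dt
    return pos

dt = 0.01                       # 100 Hz sampling, 10 s run
bias = np.full(1000, 0.01)      # a tiny 0.01 m/s^2 accelerometer bias
pos = double_integrate(bias, dt)
# After only 10 s the position error is already about 0.5 m.
```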
Although it is certainly possible to improve such positioning information through sensor fusion with external reference signals (an optical system, barometric pressure etc.), in many cases a direct forward calculation of position from linear acceleration is required.
Lately we have been working on gradually improving the accuracy of our linear acceleration measurements and on tuning these measurements with various filters in order to obtain relatively reliable displacement information.
The video below shows an example of displacement tracking on the vertical axis using an LPMS-B device. Apart from the sensor’s own gyroscope, accelerometer and magnetometer, no external references were used.
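One common filtering trick for keeping double-integrated displacement bounded is to make both integrators "leaky", which acts as a high-pass filter: slow drift decays away while fast motions pass through. The coefficients below are illustrative assumptions, not our actual filter design.

```python
# Leaky (high-pass) double integration sketch: drift decays, motion passes.
import numpy as np

def leaky_displacement(lin_acc, dt, leak=0.995):
    pos = np.zeros(len(lin_acc))
    v = p = 0.0
    for i, a in enumerate(lin_acc):
        v = leak * v + a * dt   # leaky velocity integrator
        p = leak * p + v * dt   # leaky position integrator
        pos[i] = p
    return pos

dt = 0.01
t = np.arange(0, 10, dt)
acc = 0.02 * np.ones_like(t)    # pure bias: should be suppressed
drifted = np.cumsum(np.cumsum(acc) * dt) * dt
leaked = leaky_displacement(acc, dt)
```

On this pure-bias input the naive double integration drifts past one meter within 10 s, while the leaky version settles at a small bounded offset. The trade-off is that very slow real motions are attenuated as well, which is why the leak factor has to be tuned per application.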
We had the opportunity to try out the new augmented reality glasses AiRScouter, produced by the Japanese company Brother. We first tried a pair at a Brother product exhibition here in Tokyo. Although the glasses are a little heavier than normal glasses, they fit quite well and the overlay image is clearly visible.
We experimented with the glasses a bit and set up a prototype augmented reality application, codename LpGlass, using our LPMS-B sensor for head tracking. The video below shows a demo of our LPMS-B IMU attached to the AiRScouter.
Similar to Google Glass, there seems to be a huge number of possible applications, especially for augmenting task environments in medical procedures, industrial assembly, education etc.