Anchoring Virtual Objects with Varjo XR-3

A key aspect of making a mixed reality experience compelling and useful is to correctly anchor virtual content to the real world. An object that is fixed in its position and orientation (pose) relative to the real world should not change its pose as seen by the user while they move around the scene wearing the headset. Imagine a simple virtual cube positioned so that it appears to sit on a real table. As the user walks around, this cube should remain in place, regardless of the perspective from which the user looks at it.

In more detail, correct anchoring of virtual objects to reality depends on the following points:

  1. In order to create correctly aligned content in a mixed reality experience, accurate knowledge of the user’s head pose is essential. We calculate the head pose using LPVR-CAD; a minimal transform sketch follows below this list.
  2. The field of view provided by the cameras and the natural field of view of the human eyes are very different. Appropriate calibration is needed to compensate for this effect. The illustration below shows this inherent problem of video pass-through very clearly. Varjo’s HMDs are factory calibrated to minimize the impact of this effect on the user experience.
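
To make point 1 concrete, the sketch below shows how a world-anchored object is expressed in the headset’s view frame from the tracked head pose. This is a minimal illustration using Eigen for the transform math and made-up example poses; it is not LPVR-CAD or Varjo API code.

```cpp
// Minimal sketch (not LPVR-CAD or Varjo API code): expressing a world-anchored
// object in the headset's view frame from the tracked head pose.
#include <Eigen/Geometry>
#include <cstdio>

int main() {
    using Eigen::AngleAxisd;
    using Eigen::Isometry3d;
    using Eigen::Vector3d;

    const double kDegToRad = 3.14159265358979323846 / 180.0;

    // Virtual cube fixed on the real table: a constant pose in the world frame.
    Isometry3d worldFromCube = Isometry3d::Identity();
    worldFromCube.translation() = Vector3d(0.0, 0.8, -1.5);

    // Head pose for the current frame, as it would be reported by the tracking
    // system (here: user standing 0.3 m to the side, head turned ~10 degrees).
    Isometry3d worldFromHead = Isometry3d::Identity();
    worldFromHead.linear() =
        AngleAxisd(10.0 * kDegToRad, Vector3d::UnitY()).toRotationMatrix();
    worldFromHead.translation() = Vector3d(0.3, 1.7, 0.0);

    // Pose of the cube as seen from the headset:
    // T_head_cube = T_world_head^-1 * T_world_cube.
    Isometry3d headFromCube = worldFromHead.inverse() * worldFromCube;
    Vector3d p = headFromCube.translation();
    std::printf("cube in view frame: %.3f %.3f %.3f\n", p.x(), p.y(), p.z());

    // Re-evaluating this every frame with an accurate head pose is what keeps the
    // cube visually locked to the real table while the user walks around it.
    return 0;
}
```

Only if the head pose used in this composition matches reality does the rendered cube stay fixed to the real table; any pose error shows up directly as apparent displacement of the virtual content.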

To get a better idea of how a correct optical see-through (OST) calibration influences mixed reality performance, we recommend experimenting with the camera configuration options in Varjo Lab Tools.

– Image credit: Varjo mixed reality documentation

Functional Testing of MR Performance

Our LPVR solution must at the very least achieve a precision that is satisfactory for our users’ typical applications. Therefore, we ran a series of experiments to evaluate the precision of our system for mixed reality experiences and to compare it with SteamVR Lighthouse tracking.

We evaluated the following configurations:

#  | HMD        | Engine     | Tracking system | Varjo markers
1  | Varjo XR-3 | Unreal 5.2 | LPVR-CAD        | No
2  | Varjo XR-3 | Unreal 5.2 | Lighthouse      | No
3  | Varjo XR-3 | Unreal 5.2 | LPVR-CAD        | Yes
4  | Varjo XR-3 | Unreal 5.2 | Lighthouse      | Yes

1 – LPVR Tracking without Varjo Markers

In this scenario we fixed a simple virtual cube on a table top in a defined position and orientation. The cube is the only virtual object in this scenario; everything else is the passed-through live video feed from the HMD cameras. The HMD is tracked using LPVR-CAD in combination with an ART Smarttrack 3. No markers are used to stabilize the pose of the cube.

Important note: The tracking performance for the mixed reality use case can significantly change if the marker target that is attached to the HMD is not correctly adjusted. Please refer to the LPVR-CAD documentation or contact us for further support.
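
To illustrate why this adjustment matters, the back-of-the-envelope sketch below estimates the apparent shift of a virtual object caused by a small rotational error in the target-to-HMD calibration: the shift grows roughly with the distance to the object times the error angle. The numbers are purely illustrative.

```cpp
// Back-of-the-envelope sketch: a small rotational error in the target-to-HMD
// calibration shifts every virtual object by roughly
// (distance to object) x (error angle in radians). Numbers are illustrative.
#include <cstdio>

int main() {
    const double kPi = 3.14159265358979323846;
    const double distancesM[] = {0.5, 1.0, 2.0};  // eye-to-object distance in metres
    const double errorsDeg[]  = {0.5, 1.0, 2.0};  // rotational calibration error

    for (double errDeg : errorsDeg) {
        for (double d : distancesM) {
            // Small-angle approximation, converted to centimetres.
            double shiftCm = d * (errDeg * kPi / 180.0) * 100.0;
            std::printf("%.1f deg error, object at %.1f m -> ~%.1f cm shift\n",
                        errDeg, d, shiftCm);
        }
    }
    // Even a 1 deg misalignment displaces an object at arm's length by roughly
    // 1-2 cm, which is the magnitude of drift discussed in the conclusion.
    return 0;
}
```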

2 – Lighthouse Tracking without Varjo Markers

The basic setup of this scenario is identical to scenario #1, except that we use Lighthouse tracking to determine the pose of the HMD.

3 – LPVR Tracking with Varjo Markers

Regardless of how well the tracking of the HMD itself is working, as long as distortions of the environment as seen via the video pass-through feed are not perfectly compensated, there will be a discrepancy between where objects appear in reality and where they are displayed in virtual view space. As there are limits to the precision of such an OST calibration, another way to compensate for its effect is to get additional information about the environment directly from the video feed and align objects to it.

One such tool is Varjo markers, i.e. QR codes that are placed in the scene. Using image analysis, virtual objects can be fixed to these QR codes and are therefore automatically realigned to the video feed as the user moves around. The video below shows the result of this scenario.
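
The re-anchoring logic behind this approach can be summarized in a few lines. The sketch below uses placeholder types (a hypothetical MarkerObservation struct and Eigen transforms), not the actual Varjo SDK interface; it only illustrates how an object pose stored relative to a marker is recomputed from each new marker detection.

```cpp
// Sketch of the marker-based anchoring idea (placeholder types, not the Varjo
// SDK): the cube pose is stored relative to a detected marker and its world pose
// is recomputed from the latest marker observation, so residual offsets between
// the video feed and the tracked head pose are absorbed.
#include <Eigen/Geometry>
#include <cstdio>

// Hypothetical observation of one marker, expressed in the tracking/world frame.
struct MarkerObservation {
    long id;
    Eigen::Isometry3d worldFromMarker;
};

// Re-anchor: world pose of the object = marker pose * fixed object-in-marker offset.
Eigen::Isometry3d anchorToMarker(const MarkerObservation& marker,
                                 const Eigen::Isometry3d& markerFromObject) {
    return marker.worldFromMarker * markerFromObject;
}

int main() {
    using Eigen::Isometry3d;
    using Eigen::Vector3d;

    // Cube sits 10 cm above the marker that is taped to the table.
    Isometry3d markerFromCube = Isometry3d::Identity();
    markerFromCube.translation() = Vector3d(0.0, 0.1, 0.0);

    // Marker pose as detected in the current video frame (made-up numbers).
    MarkerObservation obs;
    obs.id = 100;
    obs.worldFromMarker = Isometry3d::Identity();
    obs.worldFromMarker.translation() = Vector3d(0.02, 0.78, -1.49);

    Isometry3d worldFromCube = anchorToMarker(obs, markerFromCube);
    Vector3d p = worldFromCube.translation();
    std::printf("re-anchored cube pose: %.3f %.3f %.3f\n", p.x(), p.y(), p.z());

    // Because the marker is detected in the same images the user sees, the cube
    // follows any residual shift of the video feed instead of drifting off the table.
    return 0;
}
```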

4 – Lighthouse Tracking with Varjo Markers

In our final test scenario we repeated the experiment from scenario #3, but with Lighthouse tracking instead of LPVR tracking.

Conclusion

The table below summarizes our preliminary findings:

Test scenario          | LPVR Tracking | Lighthouse Tracking
Without Varjo markers  | 2             | 2.5
With Varjo markers     | 1.5           | 1.5

– Approximate displacement error on the horizontal plane (in cm)

Using the same room setup and test scene, the mixed reality accuracy of LPVR-CAD and Lighthouse tracking is similar. With both tracking systems, slight shifts of 1-2 cm can be observed, depending on the head movement. This residual drift can be reduced further by using Varjo markers, which align virtual objects directly with the video feed from the pass-through cameras. Good results with LPVR tracking require precise adjustment of the optical target attached to the headset.

Note that our method of estimating the displacement error is qualitative rather than quantitative. In this post we made a general comparison of LPVR and Lighthouse tracking, with and without Varjo markers. A more quantitatively accurate evaluation will follow.
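
As a sketch of how such a quantitative evaluation could look, the example below computes the mean and maximum horizontal displacement between the nominal anchor position and the positions where the object is actually observed (e.g. from marker detections or manual annotation). The sample values are invented and only illustrate the bookkeeping, not our actual measurement procedure.

```cpp
// Sketch of one way to quantify the displacement error: compare the nominal
// anchor position with observed anchor positions logged over a head-motion sweep
// and report mean and maximum horizontal offset. The sample data is invented.
#include <Eigen/Core>
#include <algorithm>
#include <cstdio>
#include <vector>

// Horizontal (XZ-plane) displacement between nominal and observed positions, in cm.
double horizontalErrorCm(const Eigen::Vector3d& nominal,
                         const Eigen::Vector3d& observed) {
    Eigen::Vector3d d = observed - nominal;
    d.y() = 0.0;              // ignore vertical offset, as in the table above
    return d.norm() * 100.0;  // metres -> centimetres
}

int main() {
    const Eigen::Vector3d nominal(0.0, 0.8, -1.5);
    const std::vector<Eigen::Vector3d> observed = {
        Eigen::Vector3d(0.012, 0.801, -1.507),
        Eigen::Vector3d(0.018, 0.799, -1.512),
        Eigen::Vector3d(0.004, 0.800, -1.496),
    };

    double sum = 0.0, worst = 0.0;
    for (const auto& o : observed) {
        double e = horizontalErrorCm(nominal, o);
        sum += e;
        worst = std::max(worst, e);
    }
    std::printf("mean %.2f cm, max %.2f cm over %zu samples\n",
                sum / observed.size(), worst, observed.size());
    return 0;
}
```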

For customers wanting to reduce drift as much as possible, we recommend the use of markers and optical tracking. Results may differ with the Varjo XR-4 and with other variations of the tracking environment or displayed content, which could warrant further testing in the future.

About Marc Keen

Marc is a techno-artistic hybrid who makes essential contributions to LP-Research's products by applying a strict testing regimen. Besides making sure that product quality always comes first, Marc also works on UI / UX concepts for LPVIZ.