Bitforge at the Oculus Mobile VR Jam: Spatial VR Viewer

The Oculus Mobile VR Jam 2015, an international competition to develop new apps for the Samsung Gear VR (virtual reality) glasses, is currently being judged. And we are part of it.

Virtual reality glasses are an evolving technology whose practical uses will take shape over the coming years. Much as mobile telephony stood shortly before the breakthrough of smartphones, we expect a user-friendly VR environment to emerge in the near future and open up revolutionary access to digital data.

For this competition, with its strong focus on innovation, we decided to explore and expand the application possibilities of the given hardware: the Gear VR, a pair of video glasses powered by a smartphone.

A central problem with the Gear VR is head tracking: the glasses detect the user’s viewing direction accurately and quickly, but not changes in position, such as lateral movements of the head. For a convincing VR experience, the virtual viewpoint should mimic the real movements as closely as possible. Our motivation was to build such an experience for the Gear VR with existing technology.

We developed an application for viewing virtual 3D objects in a kind of exhibition setting. One obvious field of application is architecture, where a model can be viewed at a reduced scale, much like a classic, physically built architectural model. In our example we use such a model, based on CAD data of an ASA AG building. The user physically walks around a marker, placed for example on a central pedestal, in order to view the virtual building model from all sides with the Gear VR. From this real movement around the marker we derive the camera position relative to the virtual object.
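To make this relationship concrete, here is a minimal sketch in Python/NumPy (not our actual Unity code; the function name and matrix conventions are ours for illustration): a marker tracker reports where the marker lies in the camera frame, and inverting that rigid transform yields where the camera, and thus the viewer, stands in the model’s coordinate system.

```python
# Minimal sketch: a marker tracker reports the marker's pose in the camera
# frame as a rotation R and translation t. Inverting that rigid transform
# gives the camera's pose in the marker's (= the virtual model's) frame.
import numpy as np

def camera_pose_from_marker(R_marker, t_marker):
    """R_marker: 3x3 rotation, t_marker: 3-vector, marker pose in the camera
    frame. Returns the camera's rotation and position in marker coordinates."""
    R_cam = R_marker.T               # inverse of a rotation is its transpose
    p_cam = -R_marker.T @ t_marker   # camera position relative to the marker
    return R_cam, p_cam
```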

Since we want the user to be able to move both laterally and perpendicular to the object being viewed, we need reliable positioning in virtual space and fast reaction times across different viewing angles and situations. To achieve this, we merged two different positioning techniques:
First, we continuously calculate displacements in 3D from the acceleration sensors and gyroscopes of the Gear VR (or rather of the smartphone inserted in it).

Second, we use Vuforia marker tracking from Qualcomm to determine where the viewer is in the room relative to the central object. A printed image serving as a visual marker is placed at the location where the virtual object should appear; whenever the smartphone camera “sees” the marker, the spatial position is calculated from it.
We merge the outputs of these two techniques into a result that is faster and more stable than either technique could deliver on its own. This allows viewers to immerse themselves in a virtual reality in which they can move sideways around the object, something that is otherwise impossible with the Gear VR.
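The fusion idea can be illustrated with a simple complementary filter: the high-rate inertial data is integrated between marker sightings, and each absolute marker fix pulls the estimate back, bounding the drift that double integration accumulates. The following Python sketch is illustrative only; the class and parameter names are our own, the inputs are assumed to be gravity-compensated acceleration in world coordinates plus occasional marker position fixes, and the real app runs inside Unity against the Gear VR sensors and Vuforia’s reported pose.

```python
# Illustrative fusion of the two position sources (names and values are ours,
# not from the production app). predict() runs at IMU rate and dead-reckons
# by double integration; correct() runs whenever the camera sees the marker
# and blends the absolute fix back in, bounding the accumulated drift.
import numpy as np

class FusedPositionEstimator:
    def __init__(self, blend=0.15):
        self.blend = blend            # strength of each marker correction
        self.position = np.zeros(3)   # estimated head position (world frame)
        self.velocity = np.zeros(3)

    def predict(self, accel_world, dt):
        """High-rate IMU step: integrate gravity-compensated acceleration
        (already rotated into world coordinates) into velocity and position."""
        self.velocity += accel_world * dt
        self.position += self.velocity * dt

    def correct(self, marker_position):
        """Low-rate marker step: pull the estimate toward the absolute fix
        from marker tracking and damp the drifting velocity estimate."""
        error = marker_position - self.position
        self.position += self.blend * error
        self.velocity *= 1.0 - self.blend
```

A Kalman filter would be the more rigorous way to weight the two sources; the complementary blend shown here captures the same idea with far less machinery.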

You can now see the model in three dimensions, but of course you cannot touch it. To interact with the house anyway, you can use the glasses’ built-in touchpad to lift the floors of the model individually and look inside.
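As a rough illustration of that interaction, assuming each storey of the model is a separately movable node whose vertical offset can be set (a structure we choose here for the sketch, not a description of the app’s internals), lifting a floor amounts to toggling an offset on the selected storey:

```python
# Minimal sketch of the floor-lifting interaction, assuming each storey of
# the model is a separately movable node with a settable vertical offset.
FLOOR_LIFT = 2.0  # illustrative lift height in model units

class ExplodedModel:
    def __init__(self, num_floors):
        self.offsets = [0.0] * num_floors  # all floors flush initially

    def toggle_floor(self, index):
        """Raise the selected floor if it is flush, lower it if lifted."""
        self.offsets[index] = FLOOR_LIFT if self.offsets[index] == 0.0 else 0.0

# e.g. a swipe on the touchpad selects floor 2 and lifts it:
model = ExplodedModel(num_floors=5)
model.toggle_floor(2)
```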

We look forward to further developments in the field of virtual reality and want to be actively involved right from the start. The VR Jam results will be announced at the beginning of June 2015.
For us, this work is primarily a feasibility study, a basis on which future products can be developed.