Indoor AR navigation

One of the great strengths of augmented reality is that it greatly simplifies the translation of two-dimensional data into three-dimensional space. Anyone who has ever searched for orienteering flags in a forest knows how tedious interpreting 2D maps can be; being guided directly to the next control in AR would be far easier. So it’s no wonder that AR navigation is currently high on the roadmap of many companies.

It’s also no wonder that the tech giants are trying to outdo each other. Google is currently in the lead: its “Live View” feature has been rolled out for some time and can be used without restrictions.

Augmented Reality Navigation

Apple is working on AR features

It’s no secret that Apple also has augmented reality high on its priority list. So far, Apple’s strength lies primarily on the device side: when it comes to spatial detection, Apple devices generally have an advantage over Android devices. The company has just demonstrated this again with the launch of the new iPad Pro, whose LiDAR scanner is currently the benchmark in spatial and depth sensing:

It is also no longer a secret that Apple has been working on its own AR wearable for some time. However, as an analysis of iOS 14 code now shows, Apple is planning even more: an AR shopping app with the working title “Gobi”. The app will be able to display additional information to consumers in AR while they are in a store. The prerequisite is that the device knows exactly where in the store it is currently located. In other words: indoor AR navigation.

Indoor AR navigation is a big step

So while “classic” AR navigation is already becoming established outdoors, indoor navigation remains a considerable challenge. With good reception, GPS is accurate to around five meters, which is sufficient outdoors in the vast majority of cases. Inside buildings, however, GPS performs much worse and is therefore unsuitable for localization, so other approaches are needed.

Overview of current technologies

Which approach works best indoors is still an open question, as no single technology has established itself yet. Instead, a large number of companies with very good approaches are vying for supremacy.

An overview of the various technologies currently in use:

Image Marker

AR markers have long been a popular method of displaying AR content: an image is stored in the AR app, and the AR content is loaded as soon as this image is recognized. Applied to navigation, this means that as soon as the marker is recognized, the device knows where it is, and the route is then derived from that position.

As things stand today, marker-based navigation is still too imprecise to function reliably. When the marker is scanned, a 3D model of the surroundings is loaded, and the device calculates its exact position within the model based on the angle at which the smartphone is held relative to the marker. Even minimal deviations between the marker’s placement in the model and in reality introduce an error that grows as the navigation continues.

This drift can be countered with additional markers distributed at regular intervals, which must be scanned to recalibrate the position. However, user-friendliness suffers as a result.
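The error propagation described above can be sketched with a toy calculation. This is a minimal 2D sketch; the function name, positions, and angles are purely illustrative:

```python
import math

def device_position(marker_pos, marker_yaw_deg, rel_distance, rel_bearing_deg):
    """Place the device in the venue model from a recognized marker.

    marker_pos / marker_yaw_deg: where the marker sits in the 3D model.
    rel_distance / rel_bearing_deg: device pose relative to the marker,
    as estimated from the camera image.
    """
    angle = math.radians(marker_yaw_deg + rel_bearing_deg)
    return (marker_pos[0] + rel_distance * math.cos(angle),
            marker_pos[1] + rel_distance * math.sin(angle))

# Marker stored in the model at (10, 5) facing 90 degrees; the camera sees
# the device 4 m away, straight in front of the marker.
true_pos = device_position((10.0, 5.0), 90.0, 4.0, 0.0)

# If the physical marker hangs just 3 degrees off from its placement in the
# model, the same observation yields a shifted estimate ...
off_pos = device_position((10.0, 5.0), 93.0, 4.0, 0.0)

# ... and the resulting error grows linearly with distance from the marker.
error_m = math.dist(true_pos, off_pos)   # ~0.21 m at 4 m distance
```

At 40 m from the marker, the same 3-degree mounting error would already amount to roughly 2 m, which is exactly why recalibration markers along the route become necessary.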

Beacons

Another approach is orientation using beacons: small Bluetooth transmitters that broadcast signals to nearby devices. A prominent example of beacons used for navigation is London Gatwick Airport, where 2,000 of them have been installed to enable AR navigation within the airport.

The positioning accuracy of beacons is reasonably high, officially around ±3 meters. However, beacons have the disadvantage that they are not cheap to purchase and require maintenance, as their batteries have to be replaced from time to time.
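How a position is derived from several beacon readings can be sketched as follows. The log-distance signal model and the 2D trilateration are textbook formulas; the parameter values and beacon positions are illustrative, not taken from any specific vendor or the Gatwick installation:

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exp=2.0):
    """Rough distance estimate from signal strength (log-distance model).
    tx_power_dbm is the calibrated RSSI at 1 m; these are typical defaults."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(b1, b2, b3):
    """2D position from three (x, y, distance) beacon readings."""
    (x1, y1, r1), (x2, y2, r2), (x3, y3, r3) = b1, b2, b3
    # Subtracting the circle equations pairwise yields two linear equations.
    a = 2 * (x2 - x1); b = 2 * (y2 - y1)
    c = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2); e = 2 * (y3 - y2)
    f = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    x = (c * e - f * b) / (e * a - b * d)
    y = (c * d - a * f) / (b * d - a * e)
    return x, y

# Three beacons at known positions; distances as they would be estimated
# from RSSI for a device standing at (2, 3).
estimate = trilaterate((0.0, 0.0, math.sqrt(13)),
                       (5.0, 0.0, math.sqrt(18)),
                       (0.0, 5.0, math.sqrt(8)))
```

In practice, RSSI fluctuates heavily indoors, so readings have to be smoothed and far more than three beacons are used, which is part of the reason Gatwick needed 2,000 of them.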

Visual Positioning System (VPS)

Google is going one step further and has already rolled out an AR navigation feature across the board with “Live View” in Google Maps. Live View is based on the “Visual Positioning System” (VPS) developed by Google, which combines GPS data with the camera feed: GPS determines the user’s approximate location, and the exact position is then pinned down by interpreting the surroundings against the Street View data Google has collected itself.
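The two-stage idea, a coarse GPS fix followed by visual matching, can be sketched roughly like this. The data structures and the matching score are simple stand-ins for real image descriptors and feature matching, not Google’s actual implementation:

```python
import math

# Toy "Street View" reference database: a capture position plus a feature set
# standing in for real image descriptors (SIFT, learned features, ...).
REFERENCE_DB = [
    {"pos": (1.0, 2.0), "features": {"door", "sign", "pillar"}},
    {"pos": (3.0, -1.0), "features": {"pillar", "stairs", "clock"}},
    {"pos": (90.0, 90.0), "features": {"door", "sign", "stairs", "clock"}},
]

def coarse_candidates(gps_fix, reference_db, radius_m=25.0):
    """Step 1: the GPS fix narrows the search down to nearby reference captures."""
    return [ref for ref in reference_db
            if math.dist(gps_fix, ref["pos"]) <= radius_m]

def visual_fix(camera_features, candidates):
    """Step 2: the camera feed is matched against the shortlisted captures;
    the best match pins down the exact position."""
    best = max(candidates, key=lambda ref: len(camera_features & ref["features"]))
    return best["pos"]

candidates = coarse_candidates((0.0, 0.0), REFERENCE_DB)   # 2 of 3 are nearby
position = visual_fix({"pillar", "stairs", "clock"}, candidates)
```

The GPS pre-filter is what makes the approach scale: the expensive visual matching only ever runs against a handful of nearby reference captures instead of the whole database.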

The “SBB AR” app, available as a preview in the Play Store, is based on the same technology. It makes it possible to navigate Zurich main station in augmented reality and displays additional information at certain locations. This shows that the technology, already tried and tested outside buildings, also works inside them. Zurich main station is a meaningful test environment, as its multiple floors add to the complexity.

In a sense, Google’s VPS is a promising step towards an AR cloud.

Apple Indoor Maps

Apple is now following a similar path to Google. Like its big competitor, the company also relies on a combination of different localization mechanisms. However, instead of GPS plus camera, Apple relies on a combination of WiFi and the motion sensors in the smartphone.
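A minimal sketch of how motion sensors (pedestrian dead reckoning) and WiFi fixes might be combined; the step length, weights, and function names are assumptions for illustration, not Apple’s actual implementation:

```python
import math

def pdr_step(pos, heading_deg, step_len_m=0.7):
    """Pedestrian dead reckoning: advance the position by one detected step,
    using the heading from the motion sensors (step length is an assumption)."""
    h = math.radians(heading_deg)
    return (pos[0] + step_len_m * math.cos(h),
            pos[1] + step_len_m * math.sin(h))

def fuse(pdr_pos, wifi_pos, wifi_weight=0.3):
    """Complementary filter: the smooth but drifting dead-reckoned track is
    nudged towards the noisy but drift-free WiFi fix."""
    return tuple(p + wifi_weight * (w - p) for p, w in zip(pdr_pos, wifi_pos))

pos = (0.0, 0.0)
for _ in range(10):                # ten steps along one corridor axis
    pos = pdr_step(pos, 0.0)       # dead-reckoned: roughly (7.0, 0.0)

pos = fuse(pos, (6.0, 1.0))        # a WiFi fix pulls the drifted estimate back
```

The division of labor mirrors the GPS-plus-camera combination: one signal is coarse but absolute, the other is fine-grained but accumulates error, and fusing them gives a track that is both smooth and anchored.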

In theory, it looks like this:

Although competitor Google is already much further along, initial tests look promising. There is even a test from Zurich main station that allows a direct comparison:

But that’s not all: DentReality, Apple’s official technology partner for indoor AR navigation, recently published additional test videos. These show how Apple goes one step further and achieves such precise navigation that Keiichi Matsuda’s concept of “hyper reality” no longer seems quite so unrealistic:

We are following developments with great interest, as we are also constantly experimenting with the topic of AR navigation.