The application uses a technique called Simultaneous Localization and Mapping (SLAM) to track the device's position in real time and render the tank from any perspective in new, unmapped environments.
The demo video below shows the demonstrator walking around the tank and viewing it from different angles, without relying on a specially prepared environment or background for the app to work with.
According to Ogment, SLAM “is typically used by robots and autonomous vehicles to build up a map within an unknown environment… while at the same time keeping track of their current location.” Applying the system to AR games allows users to drop digital environment elements into any space.
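The core idea in that quote can be sketched in a few lines: the system refines its estimate of its own position and its map of the environment at the same time. The toy 1D filter below is purely illustrative; the landmark setup, numbers, and simple gain-based correction are assumptions for the sketch, not Ogment's actual algorithm.

```python
def slam_step(pose_est, landmark_est, motion, observed_range, gain=0.5):
    """One predict/update cycle of a toy 1D SLAM filter."""
    pose_est += motion                           # predict: dead-reckon forward
    predicted_range = landmark_est - pose_est    # range the current map implies
    error = observed_range - predicted_range     # innovation
    pose_est -= gain * error / 2                 # split the correction between
    landmark_est += gain * error / 2             # the pose and the map
    return pose_est, landmark_est

# Ground truth, hidden from the filter: landmark at 30.0, robot starts
# at 0.0 and moves 1.0 per step, but odometry over-reports by 10%.
true_landmark, true_pose = 30.0, 0.0
pose_est, landmark_est = 0.0, 28.0   # initial map guess is off by 2.0
dead_reckoning = 0.0                 # pose estimate with no correction

for _ in range(20):
    true_pose += 1.0
    odometry = 1.1                          # biased motion measurement
    dead_reckoning += odometry
    observed = true_landmark - true_pose    # ideal range sensor, for simplicity
    pose_est, landmark_est = slam_step(pose_est, landmark_est,
                                       odometry, observed)

print(round(pose_est, 2), round(landmark_est, 2), round(dead_reckoning, 2))
```

Pure dead reckoning drifts with the odometry bias, while the corrected pose and landmark estimates stay close to the truth; that mutual correction of location and map is what "simultaneous" means in SLAM.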
Ogment is billing the tank as having “x-ray vision”—that is, when the tank spins its cannon and fires a shot at a detected surface, the AR application will display a “hole” showing “what’s behind” that surface. In the demo video, a hole blasted in the tablecloth where Will Wright and Bruce Sterling are sitting shows bottles of booze and pantless legs (though if this were real and serious AR, it would show two bloody stumps instead).
Oriel Bergig, vice president of research and development at Ogment, told Ars that other pre-loaded X-ray vision themes will include “scenery” and “urban.” But the X-ray vision is just a distraction; more important is the game’s ability to render a tank in real time, in real space, without prior preparation, and to detect targetable surfaces in its environment.