It all comes down to how sunlight bounces off surfaces differently at different times of year, as the sun's position relative to the scene changes.
“As the sun passes over the scene, different pixels will light up at different times,” says Austin Abrams at Washington University in St Louis, Missouri. The software he and his colleagues created combines the webcam's location, which can be obtained from a separate GPS reading, with the time stamps on its images to calculate where the sun sits relative to the scene.
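That step is standard astronomy: given a latitude, longitude and a UTC time stamp, the sun's elevation and compass bearing can be computed to within roughly a degree. The Python sketch below shows one simplified way to do it; the coordinates and date are illustrative, the approximations (such as ignoring the equation of time) are mine, and this is not the researchers' actual code.

```python
import math
from datetime import datetime, timezone

def sun_position(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation and azimuth (degrees) for a given
    latitude/longitude and UTC time. Accurate to within roughly a degree,
    which is enough to illustrate the idea."""
    day_of_year = when_utc.timetuple().tm_yday
    hours_utc = when_utc.hour + when_utc.minute / 60 + when_utc.second / 3600

    # Solar declination: the sun's tilt relative to the equator
    decl = math.radians(-23.44) * math.cos(
        math.radians(360 / 365 * (day_of_year + 10)))

    # Local solar time and hour angle (equation of time ignored)
    solar_time = hours_utc + lon_deg / 15.0
    hour_angle = math.radians(15.0 * (solar_time - 12.0))

    lat = math.radians(lat_deg)
    sin_alt = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    altitude = math.asin(sin_alt)

    # Azimuth measured clockwise from north
    cos_az = ((math.sin(decl) - math.sin(lat) * sin_alt)
              / (math.cos(lat) * math.cos(altitude)))
    azimuth = math.acos(max(-1.0, min(1.0, cos_az)))
    if hour_angle > 0:              # afternoon: sun has moved west of south
        azimuth = 2 * math.pi - azimuth
    return math.degrees(altitude), math.degrees(azimuth)

# Example: a hypothetical webcam in St Louis, Missouri at local noon
print(sun_position(38.63, -90.20, datetime(2012, 6, 21, 18, 0, tzinfo=timezone.utc)))
```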
By watching how reflections change over the course of a few months, it can figure out the orientation of all the surfaces in the scene.
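That inference resembles classic photometric stereo: if a surface is roughly matte, its brightness is proportional to its reflectivity times the dot product of its normal with the direction to the sun, so many brightness samples taken under different sun positions pin down the orientation by least squares. The toy Python sketch below illustrates the idea under those assumptions; the real system also has to cope with clouds, shadows and changing weather, which this ignores.

```python
import numpy as np

def estimate_normal(sun_dirs, intensities):
    """Recover a surface orientation for one pixel from its brightness over
    time, assuming a Lambertian (matte) surface lit by the sun:
    intensity ~ albedo * max(0, normal . sun_direction).

    sun_dirs    : (N, 3) array of unit vectors pointing toward the sun
    intensities : (N,) array of observed pixel brightness values
    Returns the unit surface normal and the albedo (reflectivity) scale.
    """
    sun_dirs = np.asarray(sun_dirs, dtype=float)
    intensities = np.asarray(intensities, dtype=float)

    # Least-squares solve for g = albedo * normal over the lit observations
    g, *_ = np.linalg.lstsq(sun_dirs, intensities, rcond=None)
    albedo = np.linalg.norm(g)
    return g / albedo, albedo

# Toy example: a roof facet tilted toward the south-east (x=east, y=north, z=up)
rng = np.random.default_rng(0)
true_normal = np.array([0.3, -0.3, 0.905])
sun_dirs = rng.normal(size=(200, 3))
sun_dirs[:, 2] = np.abs(sun_dirs[:, 2])            # keep the sun above the horizon
sun_dirs /= np.linalg.norm(sun_dirs, axis=1, keepdims=True)
brightness = np.clip(sun_dirs @ (0.8 * true_normal), 0, None)

lit = brightness > 0                               # discard shadowed samples
normal, albedo = estimate_normal(sun_dirs[lit], brightness[lit])
print(normal, albedo)
```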
Google generates the 3D models that populate Google Earth by using a fleet of camera-equipped planes that fly over cities snapping photos. The program also lets users create their own 3D models of local buildings and upload them. Those models are kept simple to ensure that Google Earth runs smoothly.
The researchers’ models, by contrast, capture minute detail. “In some cases, we can even capture the 3D structure of individual shingles on a rooftop,” says Abrams.