Seeing Beneath Snow-Covered Roads

Ordinarily, self-driving cars use cameras and other sensors to “see” where they are on the road. However, what happens if the road markings are covered in snow? Well, MIT has developed a system that lets vehicles look beneath the asphalt instead.

The experimental technology incorporates what’s known as “localizing ground-penetrating radar” (LGPR), in which electromagnetic pulses are emitted into the ground and reflected back up by underground objects.

In the new system, an LGPR rig is initially used to map a stretch of road, identifying the unique combinations of soil, rocks and roots beneath the surface in a series of locations. When an LGPR-equipped car subsequently drives that same road, it continuously compares its own readings with those that were previously recorded for that area. In this way, it’s able to figure out where it is relative to the rest of the road, without using any above-ground visual references.
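
The article doesn't spell out the matching algorithm, but the basic idea of comparing live readings against a pre-recorded subsurface map can be sketched as a simple correlation search. The sketch below is purely illustrative: the array shapes, the `best_match_offset` function and the toy data are assumptions for demonstration, not MIT's actual implementation.

```python
import numpy as np

def best_match_offset(prior_map, current_scan):
    """Slide the current radar scan along the prior map and return the
    offset (in map samples) where the two agree most closely.

    prior_map    -- 2D array: one subsurface reflection profile per position,
                    recorded during the initial mapping pass (hypothetical format)
    current_scan -- 2D array: the profiles the car is measuring right now,
                    shorter than the prior map
    """
    n_map, n_scan = len(prior_map), len(current_scan)
    scores = []
    for offset in range(n_map - n_scan + 1):
        window = prior_map[offset:offset + n_scan]
        # Normalized correlation: insensitive to overall signal strength,
        # which can drift as soil moisture changes (e.g. after rain).
        a = (window - window.mean()) / (window.std() + 1e-9)
        b = (current_scan - current_scan.mean()) / (current_scan.std() + 1e-9)
        scores.append(np.mean(a * b))
    return int(np.argmax(scores))

# Toy usage: a fake prior map of 500 positions x 64 depth bins,
# plus a noisy 20-position scan taken somewhere along it.
rng = np.random.default_rng(0)
prior_map = rng.normal(size=(500, 64))
true_position = 312
current_scan = prior_map[true_position:true_position + 20] + 0.1 * rng.normal(size=(20, 64))
print(best_match_offset(prior_map, current_scan))  # should print ~312
```

In a real system the returned offset would then be converted into a position along the mapped road, giving the car its location without any above-ground visual references.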

In a closed-course test, the system was able to accurately determine the car's location on a snow-covered road, with a margin of error of only about 1 inch (25 mm).

That said, rain can also pose a problem for traditional cameras and LiDAR scanners, since falling water obscures their view of the road markings. When the new system was tested in rainy conditions, it actually had a higher margin of error than with the snow, at an average of 5.5 inches (140 mm). This was because as the rainwater soaked into the ground, it subtly altered the soil conditions from those that had been mapped previously.

Although the setup has so far only been tested at slow driving speeds on country roads, the researchers believe that it should also be usable at high speeds on highways. It isn't intended to replace a car's existing visual sensors, however – just to augment them. And while the LGPR map files themselves are considerably smaller than the currently used above-ground 2D maps, the equipment itself is still quite big and bulky at this point, measuring 6 ft (1.8 m) wide.

A paper on the research, which is being led by Prof. Daniela Rus and PhD student Teddy Ort, will be published later this month in the journal IEEE Robotics and Automation Letters.

MIT car-guiding tech sees beneath snow-covered roads [New Atlas]
