
Attacking Driverless Cars with Projected Images
2020-02-03 12:24

Abstract: The absence of deployed vehicular communication systems, which prevents the advanced driving assistance systems and autopilots of semi/fully autonomous cars from validating their virtual perception of the physical environment surrounding the car with a third party, has been exploited in various attacks suggested by researchers.

Since carrying out these attacks incurs a cost, the delicate balance between exposure and application has held, and attacks of this kind have not yet been encountered in the wild.

We show how attackers can exploit this perceptual challenge to apply phantom attacks, changing the abovementioned balance without needing to physically approach the attack scene: by projecting a phantom via a drone equipped with a portable projector, or by presenting a phantom on a hacked, Internet-facing digital billboard located near a road.

We show that the car industry has not considered this type of attack by demonstrating it on today's most advanced ADAS and autopilot technologies: the Mobileye 630 PRO and the Tesla Model X (HW 2.5). Our experiments show that when presented with various phantoms, a car's ADAS or autopilot perceives the phantoms as real objects, causing these systems to trigger the brakes, steer into the lane of oncoming traffic, and issue notifications about fake road signs.

To mitigate this attack, we present a model that analyzes a detected object's context, surface, and reflected light, and is capable of detecting phantoms with 0.99 AUC. Finally, we explain why the deployment of vehicular communication systems might reduce attackers' opportunities to apply phantom attacks but won't eliminate them.
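
To give a rough sense of what such a detector could look like, below is a minimal, hypothetical sketch in Python/PyTorch of a classifier that combines separate branches for a detected object's context, surface, and reflected-light crops into a single real-vs-phantom score. The branch design, crop sizes, and all class and variable names here are illustrative assumptions, not the authors' implementation, which this abstract does not describe in detail.

# Hypothetical sketch: a three-branch classifier that scores whether a detected
# object (e.g., a road-sign candidate) is real or a projected "phantom", using
# crops of the object's context, surface, and reflected light. All design
# choices below are assumptions for illustration only.
import torch
import torch.nn as nn

class Branch(nn.Module):
    """Small CNN mapping one 64x64 RGB crop to an embedding."""
    def __init__(self, embed_dim: int = 32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                       # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),               # global average pooling
        )
        self.fc = nn.Linear(32, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc(self.features(x).flatten(1))

class PhantomDetector(nn.Module):
    """Fuses context, surface, and reflected-light branches into one score."""
    def __init__(self):
        super().__init__()
        self.context = Branch()
        self.surface = Branch()
        self.light = Branch()
        self.head = nn.Linear(3 * 32, 1)           # logit: real (0) vs phantom (1)

    def forward(self, context, surface, light):
        z = torch.cat([self.context(context),
                       self.surface(surface),
                       self.light(light)], dim=1)
        return self.head(z).squeeze(1)

# Toy usage: random tensors standing in for crops of a batch of 4 detections.
model = PhantomDetector()
crops = [torch.randn(4, 3, 64, 64) for _ in range(3)]
phantom_prob = torch.sigmoid(model(*crops))
print(phantom_prob)  # per-detection probability that the object is a phantom

The 0.99 AUC quoted above would then be computed over such per-detection phantom scores against ground-truth real/phantom labels, for example with sklearn.metrics.roc_auc_score.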


News URL

https://www.schneier.com/blog/archives/2020/02/attacking_drive.html