Researchers fool Tesla Model S AutoPilot System
The debate over the reliability of autonomous vehicles has been going on for a long time. As if that were not enough, a further concern has now cropped up: hackers deliberately interfering with the sensors and software a self-driving car depends on.
Researchers from the University of South Carolina, Zhejiang University in China, and the Chinese security firm Qihoo 360 have found potential flaws in Tesla’s AutoPilot semi-autonomous driving system. The team was able to exploit weaknesses in the system that lead researcher and USC professor Wenyuan Xu said “highly motivated people” could use “to cause personal damage or property damage.”
The group, which used off-the-shelf products to carry out the attacks, will present its findings at the DEFCON hacking conference in Las Vegas.
According to Wired, “Tesla’s autopilot detects the car’s surroundings three different ways: with radar, ultrasonic sensors, and cameras. The researchers attacked all of them, and found that only their radar attacks might have the potential to cause a high-speed collision. They used two pieces of radio equipment—a $90,000 signal generator from Keysight Technologies and a VDI frequency multiplier costing several hundred dollars more—to precisely jam the radio signals that the Tesla’s radar sensor, located under its front grill, bounces off of objects to determine their position. The researchers placed the equipment on a cart in front of the Tesla to simulate another vehicle.”
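That jamming hardware is exotic, but its effect on the receiver is easy to sketch in software. The C++ toy model below is a simplified illustration, not the researchers’ tooling: a pulsed sensor declares an obstacle only when the averaged echo rises a few standard errors above the measured noise floor, so a jammer that raises that floor buries the echo and the detector silently reports nothing, matching the no-warning behavior Prof. Xu describes next. All the numbers (echo strength, jammer power, detection margin) are made up for the demonstration.

```cpp
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

// Estimate the receiver's noise floor (RMS) from echo-free samples.
double noiseFloor(const std::vector<double>& rx, int n) {
    double sumSq = 0.0;
    for (int i = 0; i < n; ++i) sumSq += rx[i] * rx[i];
    return std::sqrt(sumSq / n);
}

int main() {
    std::mt19937 rng(1);                        // fixed seed: repeatable demo
    std::normal_distribution<double> unit(0.0, 1.0);

    const int samples = 1000;
    const int echoStart = 600, echoLen = 20;    // where the obstacle's echo lands
    const double echoAmp = 6.0;                 // strength of the reflected pulse
    const double requiredSnr = 3.0;             // sigmas above noise to declare a target

    for (double jamAmp : {0.0, 30.0}) {         // clean channel vs. jammed channel
        std::vector<double> rx(samples);
        for (int i = 0; i < samples; ++i) {
            rx[i] = unit(rng) + jamAmp * unit(rng);       // thermal + jamming noise
            if (i >= echoStart && i < echoStart + echoLen)
                rx[i] += echoAmp;                         // the obstacle's echo
        }
        double floor = noiseFloor(rx, echoStart);         // pre-echo samples only
        double win = 0.0;                                 // average over echo window
        for (int i = echoStart; i < echoStart + echoLen; ++i) win += rx[i];
        win /= echoLen;
        // Declare a target only if the window mean clears the noise by
        // `requiredSnr` standard errors (floor / sqrt(echoLen)).
        bool detected = win > requiredSnr * floor / std::sqrt(double(echoLen));
        std::printf("jam=%5.1f  noise floor=%6.2f  echo window=%6.2f  -> %s\n",
                    jamAmp, floor, win,
                    detected ? "obstacle detected" : "no obstacle reported");
    }
    return 0;
}
```

With the jammer off, the echo clears the detection margin easily; with it on, the same echo sits inside the raised noise floor and the window test fails, with no indication to the system that anything is wrong.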
“When there’s jamming, the ‘car’ disappears, and there’s no warning,” Prof. Xu says.

The team got a similar result with the Model S’s ultrasonic sensors, which Tesla uses to detect nearby objects for features such as self-parking. Using a DIY ultrasonic jammer built around an Arduino board, the researchers flooded the sensors with sound, drowning out the real echoes bouncing off an obstacle and removing it from the autopilot’s view. Had they let it, the Tesla would have collided with the obstacles.
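Since the article mentions the jammer was built around an Arduino board, here is a minimal Arduino-style sketch (C++) of the idea, purely hypothetical: the pin assignment, transducer wiring, and the roughly 40 kHz frequency (a common band for automotive parking sensors) are all assumptions, and the researchers’ actual design is not published in the article.

```cpp
// Hypothetical Arduino-style sketch: nothing here reflects the researchers'
// actual build. Assumption: an ultrasonic transducer is driven from pin 9,
// and the target parking sensor operates near 40 kHz.

const int TRANSDUCER_PIN = 9;            // assumed wiring
const unsigned int JAM_FREQ_HZ = 40000;  // assumed sensor band, ~40 kHz

void setup() {
  // Emit a continuous square wave at the sensor's operating frequency.
  // A loud enough tone swamps the genuine echoes, so the sensor never
  // sees the real obstacle, as described above.
  tone(TRANSDUCER_PIN, JAM_FREQ_HZ);
}

void loop() {
  // tone() keeps running on its own; nothing more to do here.
}
```

Whether a given transducer can produce enough sound pressure to swamp a sensor at a useful range is a hardware question the sketch does not answer.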
The Tesla’s camera systems, however, proved the most resistant to attack. To blind the cameras, the researchers shined lasers and LEDs at them, and Xu’s team even managed to kill a few pixels on the camera sensors. But when they tried to block a camera outright, AutoPilot simply blacked out and warned the driver to take the wheel.
For Prof. Xu, the point is less to break the system than to compel Tesla to add protections to AutoPilot.
While crediting Prof. Xu and his team’s work, Tesla told Wired in a statement: “We have reviewed these results with Wenyuan’s team and have thus far not been able to reproduce any real-world cases that pose risk to Tesla drivers.”
Source: Wired