Tesla's self-driving car test: It really can go "blind"

Researchers recently put a Tesla through laboratory tests and showed that, under certain conditions, the sensors behind its Autopilot function can be "fooled" into going "blind", posing a potential safety risk. Tesla responded that the test methods cannot be reproduced in the real world and that drivers' safety is not affected.

Since the first Autopilot-related fatality on May 7 this year, Tesla's Autopilot semi-autonomous driving function has been under close scrutiny. Beyond the official investigation in the United States, a large number of researchers have tried to find flaws in it. According to Auto Fashion, researchers from the University of South Carolina and Zhejiang University, together with technicians from Qihoo 360, finally succeeded in tricking the sensors behind Tesla's Autopilot technology, and the team presented the details at the DEF CON hacker conference. The tests showed that, in certain situations, they could completely deceive Tesla's Autopilot sensors and related functions, leaving the system "blind" in a way that could cause a crash at high speed.

The research team also admitted, however, that their methods of disrupting and fooling Tesla's Autopilot system may not be practical in the real world: the sensors were fooled on a stationary car, and some of the equipment used in the tests was very expensive. Even so, using gear that anyone can buy, such as lights, radios and sonic transmitters, the researchers were able to fool Tesla's sensors into, for example, reporting an obstacle ahead when none existed.

The researchers attacked the ultrasonic sensors, radar sensors and cameras of a Tesla Model S, and ultimately found only one attack that might lead to a high-speed crash. They reportedly used a $90,000 signal generator and a VDI frequency multiplier to jam Tesla's radar system; when these devices were switched on and aimed directly at the front of the car, the Autopilot sensors went "blind". The researchers said this kind of interference is feasible at high speed, but takes some effort. Another technician familiar with the research project and with Tesla's Autopilot disagreed, arguing that the team's method is not practical: in the experiment, for instance, the camera was disrupted only by placing a light source a few feet in front of it.

The team also found a way to interfere with the short-range ultrasonic sensors that Tesla's automatic parking and Summon functions depend on. Using an Arduino board that can generate specific voltages and an ultrasonic transducer that converts that electrical signal into sound waves, the researchers tricked Tesla's sensors from a few feet away into perceiving an obstacle that was not there. The consequence of a malicious attack of this kind is that the Model S would be unable to park in the chosen spot. The researchers also wrapped some obstacles in sound-absorbing plastic, which made them effectively invisible to Tesla's sensors. In the real world, however, attackers would have very limited opportunities to use either of these methods against a Tesla or any other vehicle.
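To make the ultrasonic spoofing easier to picture, here is a minimal, purely illustrative Arduino-style sketch of how a generic time-of-flight ultrasonic ranger turns echo timing into a distance reading. The HC-SR04-style module and the pin numbers are assumptions for the example, not Tesla's actual parking sensors or the researchers' code; the point is only that such a sensor trusts the first ultrasonic burst it hears, so an injected pulse reads as a phantom obstacle while an absorbed or jammed echo hides a real one.

```cpp
// Purely illustrative time-of-flight sketch for a generic HC-SR04-style
// ultrasonic ranger. Pin numbers and wiring are assumptions for this example;
// this is not Tesla's parking-sensor firmware or the researchers' attack code.
const int TRIG_PIN = 9;   // assumed trigger pin
const int ECHO_PIN = 10;  // assumed echo pin

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
}

void loop() {
  // Fire a 10 microsecond trigger pulse; the module then emits a ~40 kHz burst.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // The module holds ECHO high for the round-trip time of the first 40 kHz
  // burst it detects. It cannot distinguish a genuine reflection from an
  // attacker's transducer firing at the same frequency: an early injected
  // pulse shows up as a closer phantom "obstacle", while jamming or absorbing
  // the real echo makes a genuine obstacle vanish (the timeout case below).
  unsigned long roundTripUs = pulseIn(ECHO_PIN, HIGH, 30000UL);  // 30 ms timeout

  if (roundTripUs == 0) {
    Serial.println("No echo: obstacle missed, absorbed, or jammed");
  } else {
    // Speed of sound is roughly 343 m/s, so distance in cm is about
    // round-trip microseconds / 58 (the /2 for the round trip is folded in).
    float distanceCm = roundTripUs / 58.0;
    Serial.print("Apparent obstacle at ");
    Serial.print(distanceCm);
    Serial.println(" cm");
  }

  delay(100);
}
```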
Finally, the research team confirmed that pointing lasers or LEDs directly at Tesla's cameras can blind them. A laser can even permanently damage a camera, but in the end this method may not be enough to throw a Tesla off course in autonomous driving mode: once the Model S's camera is disrupted, the autonomous driving function switches off and the vehicle requires the driver to take over again.

The research team's findings did not appear to bother Tesla. A Tesla spokesperson responded: "Tesla is willing to work with security experts to discover potential weaknesses in our products and solve these problems. We are grateful to Xu and her research team for their hard work and dedicated testing of attacks on the sensors. We have communicated with Xu's team, but we believe the laboratory conditions would be difficult to replicate in the real world, and those methods will not pose a safety threat to Tesla drivers."

Xu and her research team, however, believe Tesla should take their findings more seriously. Xu said: "I don't want to convey to the outside world that the sky is falling and that you should not use the Autopilot function. These attack methods require some skill, but some zealots may well use them to cause personal injury or property damage. Overall, we hope people can learn from this and improve the reliability of the sensors. We cannot simply rely on Tesla and stop caring about our own safety."

