Tesla's Autopilot driver-assistance system has recently been involved in a string of accidents, including one last month in which a Model S collided with a large tractor trailer, killing the Model S driver at the scene. Some readers, and even some Tesla owners, may assume that because Autopilot is billed as an "autonomous driving" feature, it works the way TV reports and articles suggest: you can let go of the steering wheel and hand every operation over to the system. In reality, that assumption is very dangerous.

First, Autopilot is still in the testing stage, and all of its functions are under continuous improvement; a test system can malfunction or fail. Second, precisely because Autopilot is still a beta system, Tesla itself stresses that even with the feature enabled, the driver must keep his hands on the wheel, must not hand full control to the system, and must stay focused on the road ahead.

If you are still inclined to ignore that advice, consider the recent traffic accidents involving Tesla owners described below. To drive the lesson home, we have also included a few accidents involving Google's driverless cars, as a reminder that autonomous and driverless driving are still at a very early stage: even if your car already has similar features, you must not take them lightly.

Tesla Model S collision kills driver

Let's start with the most widely watched accident. On May 7, a Tesla Model S with Autopilot engaged collided with a tractor trailer on a highway in central Florida, and the driver died in the crash.
As for the cause, the information available so far suggests that the Model S was traveling on a divided two-way road with Autopilot engaged when a trailer crossed the road perpendicular to the Model S's path. In the strong sunlight, neither the driver nor the Autopilot system noticed the trailer's white body, so the brakes were never applied in time; the Model S struck the trailer and the driver was killed.

Model X rolls over

Just days after the Model S crash came to light, a Model X with Autopilot engaged was involved in another accident attributed to a system malfunction: it hit a roadside guardrail and rolled over. Fortunately, the driver was not seriously hurt. According to the driver, he had just switched on Autopilot when, leaving an exit, the vehicle suddenly struck the guardrail on the right, then bounced into the concrete median barrier and overturned. A driver and a passenger were in the car at the time, and neither was seriously injured. Debris from the Model X also hit another car, whose driver was likewise unhurt. Tesla has since analyzed the Model X's onboard computer data but has not yet released any findings.

Model X hits guardrail at high speed

Last Saturday, a Model X owner in Montana crashed into a roadside guardrail with Autopilot engaged. Photos show severe damage to the passenger side of the car, so this was a fairly serious accident; remarkably, the two occupants escaped without injury. According to a post on the Tesla forum, the Model X was traveling at about 88 kilometers per hour when the car suddenly hit the guardrail at the side of the road.
Reportedly, the two occupants have no idea why the car left the road and struck the guardrail after Autopilot was turned on, and they hope the cause of the accident can be determined.

Model X suddenly accelerates into a building

In June this year, a Model X owner and his wife drove to a shopping mall. As his wife was trying to park, the Model X suddenly accelerated without warning and crashed into the building in front of it. The airbags deployed, but his wife was still left with burn marks on her arm. Fortunately, the vehicle veered left in time; otherwise it would have plowed into the store ahead, causing far more casualties and damage. Whether Autopilot was engaged at the time drew outside attention, but Tesla later analyzed the car's data and stated that Autopilot was off and that the accident occurred because the accelerator pedal was pressed.

Google self-driving car hits bus

Beyond Tesla's Autopilot, Google's driverless cars, which have been in road testing for some time, have also had their share of accidents. In March this year, a Lexus RX450h that Google was testing on public roads struck a bus while changing lanes to avoid sandbags in the road. The bus was left with long scratches, and the Google car's radar unit ended up wedged against the bus's door, damaging a pane of glass in it. Fortunately, the collision was minor and the bus passengers were largely unaffected. It is worth noting, though, that the bus driver was eating a sandwich at the wheel. At the moment of impact, the Google car was traveling at 2 miles per hour and the bus at 15 miles per hour.
Even at that low speed, the collision dented the left front of Google's test vehicle, burst its left front tire, and knocked off the left-side radar used to sense the surrounding environment. Google said its software had predicted that the bus would gradually slow down and yield, but the bus driver did not, which led to the collision. The local transportation authority, however, concluded from its on-site investigation that the driverless car had initiated the collision and that the bus driver bore no responsibility for the accident.

Google self-driving car workers injured

Going back to July last year, a Google self-driving car was rear-ended, leaving the three Google employees inside with minor injuries, while the driver of the car behind complained of neck and back pain. According to the director of Google's self-driving project, the test car was waiting at a red light at an intersection in Mountain View, with traffic backed up in the right lane, when another car came up from behind, failed to brake, and rear-ended the Google car at 27 km/h. Rear-end collisions typically happen because the trailing driver is distracted, perhaps checking a phone, so the human driver behind bears the main responsibility for this accident.

In fact, on the whole, autonomous and driverless technologies have a lower accident rate than human drivers. Despite Autopilot's recent string of accidents, Elon Musk remains confident: more than one million people die in traffic accidents worldwide every year, and he has argued that if Autopilot were widely adopted, roughly 500,000 of those lives could be saved each year. Google's own analysis of its driverless-car accidents lends support to Musk's claim.
Google's accident statistics show only 0.6 accidents per 100,000 miles of driverless driving, so the prospects for driverless cars remain broad. Still, even though driverless driving cannot be written off, many problems must be solved before autonomous or driverless driving matures: road construction, weather, hackers, pedestrians, and how to handle the unruly behavior of other human drivers. All of these are challenges autonomous systems have to face.

Since humans are developing driverless technology toward a vision of "zero accidents," every accident that does occur is a sign that the technology is not yet mature and still has room to improve. After all, no one wants to gamble with their life. The development of driverless technology is irreversible; as it proceeds, beyond manufacturers' continued improvements and the refinement of laws and regulations, one can only hope that ordinary drivers and pedestrians will follow the rules as faithfully as driverless systems do, so that tragedies with casualties do not happen again.
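The fatality and accident-rate figures above can be sanity-checked with a few lines of Python. The numbers are simply those quoted in the article (Musk's "more than 1 million" annual fatalities and implied halving, and Google's 0.6 accidents per 100,000 miles); they are assumptions for illustration, not official statistics:

```python
# Figures as quoted in the article (assumptions, not official datasets).
annual_road_deaths = 1_000_000   # "more than 1 million" fatal accidents per year worldwide
assumed_halving = 0.5            # Musk's implied reduction with wide Autopilot adoption

lives_saved = int(annual_road_deaths * assumed_halving)
print(lives_saved)               # 500000

# Google's quoted testing record: 0.6 accidents per 100,000 miles.
accidents_per_mile = 0.6 / 100_000
print(accidents_per_mile)        # 6e-06
```

This is only back-of-the-envelope arithmetic; it shows that the "about 500,000 lives" figure follows directly from halving the quoted fatality count.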