
Behind the Tesla autonomous-driving crash investigation findings: has the Autopilot system been overhyped?

格隆汇 ·  Mar 11, 2020 10:05

Author: Chelsea Yang

Source: Silicon Valley Insights

The results of the investigation into the fatal Tesla crash have finally been announced. The serious accident, on a highway near Mountain View, California in March 2018, killed 38-year-old Apple software engineer Walter Huang.

In the accident, Walter Huang's Model X struck a guardrail on Highway 101 at 71 miles per hour. Two trailing cars then collided with it, damaging the Tesla's high-voltage battery and starting a fire.

The US National Transportation Safety Board (NTSB) listed three probable causes of the fatal crash: first, Tesla's Autopilot system was defective; second, Walter Huang relied too heavily on the system and let himself be distracted by a game on his phone while driving; third, the California Department of Transportation had failed to repair the highway's safety hardware in a timely manner.

Particularly noteworthy is that the NTSB rebuked Tesla for overstating its autonomous driving capability: Autopilot has only reached the L2 level, yet Tesla has packaged it as if it were L5. Moreover, when the accident occurred, Tesla's forward collision warning system failed to alert the driver to the approaching obstacle, and its automatic emergency braking system did not activate before the collision.

So what exactly is this Autopilot that the NTSB dismissed as merely L2? What tier of autonomous driving does it actually sit in, and why is Tesla so confident in it?

The Evolution of Autopilot

Is Autopilot an autonomous driving system? In its original sense, "autopilot" refers to a system that keeps an aircraft, ship, or rocket on a set course without constant human control. "Autonomous driving", on the other hand, means a vehicle can sense its surroundings and drive itself without human intervention. Judged by the original definitions, then, Autopilot is an assistance system rather than the autonomous driving system Tesla has long promoted to the public; true autonomy remains a future goal that has not yet been achieved.

After August 16, 2016, Tesla officially changed the description on its Chinese website from "Autopilot autonomous driving" to "Autopilot automatic assisted driving." Its functions include assisted steering, acceleration, and braking within a lane, but the driver must actively monitor the car and keep both hands on the steering wheel. Once the hands leave the wheel, Autopilot sounds a warning reminding the driver to retake it. If the driver ignores the alarm, the car automatically slows down, and even after the driver takes back control, the assisted driving function stays disabled and can only be restored the next time the car is parked.
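One way to picture this escalation logic is as a small state machine. The sketch below is purely illustrative: the state names, methods, and timing thresholds are invented for the example and are in no way Tesla's actual implementation.

```python
# Illustrative state machine for the hands-off escalation described
# above; all thresholds are hypothetical, not Tesla's real values.
from enum import Enum, auto


class AssistState(Enum):
    ACTIVE = auto()      # Autopilot engaged, hands detected
    WARNING = auto()     # hands off the wheel, audible alert playing
    SLOWING = auto()     # alert ignored, car decelerating
    LOCKED_OUT = auto()  # assist disabled until the next park


class HandsOnMonitor:
    WARN_AFTER_S = 10.0  # hypothetical: warn after 10 s hands-off
    SLOW_AFTER_S = 25.0  # hypothetical: decelerate after 25 s

    def __init__(self) -> None:
        self.state = AssistState.ACTIVE
        self.hands_off_s = 0.0

    def update(self, hands_on_wheel: bool, dt: float) -> AssistState:
        if self.state is AssistState.LOCKED_OUT:
            return self.state  # stays off until on_park() is called
        if hands_on_wheel:
            self.hands_off_s = 0.0
            if self.state is AssistState.SLOWING:
                # per the article: taking over after the slowdown does
                # not restore assist until the car is next parked
                self.state = AssistState.LOCKED_OUT
            else:
                self.state = AssistState.ACTIVE
        else:
            self.hands_off_s += dt
            if self.hands_off_s >= self.SLOW_AFTER_S:
                self.state = AssistState.SLOWING
            elif self.hands_off_s >= self.WARN_AFTER_S:
                self.state = AssistState.WARNING
        return self.state

    def on_park(self) -> None:
        # assist becomes available again once the car is parked
        self.state = AssistState.ACTIVE
        self.hands_off_s = 0.0
```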

In the autonomous driving classification system published by the US National Highway Traffic Safety Administration (NHTSA), Tesla has fully implemented the L2 functions (comparable to the ACC adaptive cruise systems fitted to cars from makers such as Honda) and has partially implemented L3 functions; Autopilot can, for example, monitor the surrounding environment. Many people therefore place Tesla in an "L2.5" tier.

[Image: Autonomous driving level classification system. Picture from the Internet; copyright belongs to the author.]

[Images: Introduction to the driver-assistance system on Tesla's official website. Copyright belongs to the author.]

On October 20, 2016, Tesla upgraded Autopilot across all new cars. The software moved to version 8.0, with more than 200 functions upgraded, and the hardware evolved from the original Autopilot 1.0 to today's HW3.0. Since HW3.0 has not yet been fully rolled out to owners, let's focus on the changes from 1.0 to 2.5.

  • Autopilot 1.0 is built on Mobileye's image recognition technology. The main data comes from the Mobileye camera mounted high on the windshield, while the front radar and the surrounding ultrasonic sensors provide only auxiliary information.

  • Autopilot 2.5, by contrast, builds its picture of the environment primarily from radar: the main data comes from the radar on the vehicle body, while auxiliary data comes from high-precision maps and the fleet's neural network data.


[Image: Tesla's HW1.0 board, with a lot of blank space. Picture from the Internet; copyright belongs to the author.]

[Image: Tesla's HW2.5 board, with a much denser layout. Picture from the Internet; copyright belongs to the author.]

Sticking with radar: Tesla's maverick path

Autonomous driving technology today splits into two major camps: one led by Tesla, the other by Waymo, which just received external investment for the first time, raising US$2.25 billion.

Tesla fuses data from radar, cameras, and ultrasonic sensors into a neural network that performs real-time image recognition and simulates human judgment; Waymo instead relies on lidar, which is more expensive and more accurate than radar, alongside cameras and ultrasonic sensors, to decide how the vehicle should respond.
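To make the fusion idea concrete, here is a toy confidence-weighted average of per-sensor distance estimates, a common textbook pattern. It is a minimal sketch with invented numbers, not Tesla's or Waymo's actual pipeline.

```python
# Toy confidence-weighted sensor fusion: each sensor reports a
# distance estimate and a trust weight; the fused estimate is the
# weighted average. Purely illustrative numbers below.
from dataclasses import dataclass


@dataclass
class Measurement:
    sensor: str
    distance_m: float   # estimated distance to the obstacle
    confidence: float   # 0..1, higher means more trusted


def fuse(measurements: list[Measurement]) -> float:
    """Weighted average of the per-sensor distance estimates."""
    total = sum(m.confidence for m in measurements)
    if total == 0:
        raise ValueError("no usable measurements")
    return sum(m.distance_m * m.confidence for m in measurements) / total


readings = [
    Measurement("radar", 42.5, 0.9),        # good range accuracy
    Measurement("camera", 40.8, 0.6),       # degraded in low light
    Measurement("ultrasonic", 44.0, 0.2),   # short range, noisy
]
print(f"fused distance: {fuse(readings):.1f} m")  # -> 42.1 m
```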

[Image: Hardware promotion on Tesla's official website. Picture from the Internet; copyright belongs to the author.]

So why do Waymo and many other carmakers developing autonomous driving use lidar, while Tesla insists on radar? What does Tesla see in radar?

The most essential difference between radar and lidar is wavelength. Radar uses millimeter waves, typically 4-12 mm, while lidar operates at laser wavelengths, typically between 900 and 1,500 nm.

Radar works with radio waves. Compared with laser light, radio waves pass through obstacles more easily and therefore travel farther, and the reflected waves can be used to measure a car's speed: when traffic police want to clock a specific vehicle, they use a radar gun, which works even when the vehicle is far away. However, radar imagery is so coarse that a very small metal object can be mistaken for a wall, which is why many turned their attention to the far more expensive lidar.
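Some quick numbers behind these claims, assuming a 77 GHz automotive radar (a common band; the exact frequency is our assumption, not something stated in the article):

```python
# Back-of-envelope radar arithmetic: wavelength from frequency, and
# the Doppler shift (2v / wavelength) for a target closing head-on.
C = 3.0e8  # speed of light, m/s

f_radar = 77e9            # assumed 77 GHz automotive radar
wavelength = C / f_radar  # ~3.9 mm, near the millimeter range cited

v_target = 30.0  # closing speed in m/s, roughly 67 mph
doppler_shift = 2 * v_target / wavelength  # ~15.4 kHz

print(f"wavelength: {wavelength * 1e3:.1f} mm")
print(f"Doppler shift at {v_target:.0f} m/s: {doppler_shift / 1e3:.1f} kHz")
```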

Lidar emits rapid laser pulses. Its range is shorter, but it can calculate the distance between the sensor and an obstacle precisely and can also detect the target object's exact size. Although its pulses are more easily scattered by rain and fog than radar's radio waves, its output closely resembles what the human eye sees, which is why lidar has been used to draw high-definition maps.
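The ranging principle itself is simple time-of-flight arithmetic: a pulse travels out and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal sketch:

```python
# Lidar-style time-of-flight ranging: distance = c * t_round_trip / 2.
C = 3.0e8  # speed of light, m/s


def tof_distance_m(round_trip_s: float) -> float:
    """Distance implied by a pulse's round-trip travel time."""
    return C * round_trip_s / 2


# a pulse that returns after ~667 nanoseconds implies a ~100 m target
print(f"{tof_distance_m(667e-9):.1f} m")  # roughly 100 m
```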

Today almost every company developing autonomous driving uses lidar, yet Tesla's cars carry only radar, with no lidar fitted. The main reason is cost.

Lidar is simply too expensive. Velodyne's already mass-produced 16-line lidar, for example, costs 35,000 to 45,000 yuan; the more advanced 32-line unit runs about 400,000 yuan, and the 64-line unit about 800,000 yuan. Tesla sells cars commercially to the general public, and such costs would inevitably depress sales, so for now Tesla can only use radar. Whenever asked whether Tesla will adopt lidar in the future, the ever-cocky Musk has always said no.


It is worth noting, however, that Tesla holds a data advantage over Waymo. In autonomous driving, handling unexpected situations, such as a child suddenly darting into the road, requires large-scale training data of exactly that type, so data from real cars on real roads is extremely valuable. Waymo operates commercially only in Phoenix in the US, where it has launched a taxi service, while Teslas are already running all over the world. As of 2019, Tesla had logged 2 billion miles of driving, ten times Waymo's 200 million miles.

The main problem Autopilot currently faces is its heavy reliance on cameras and radar. Both sensors measure the surrounding environment with sizable error, nothing like lidar's accuracy. Tesla, however, believes that as long as its neural network is strong enough to simulate human driving behavior, autonomous driving can be achieved even from less accurate environmental data.

Simply put: when a Tesla passes through a very short, narrow, dark tunnel, neither the radar nor the camera can clearly make out the tunnel entrance, so Autopilot falls back on its previously trained neural network data to make the call. If Tesla owners have driven through this spot before, Autopilot learns whether they passed successfully or not; once the proportion of successful passes reaches a certain threshold, the location is marked as safe to pass. This is the mechanism by which Autopilot relies on a neural network to learn human driving behavior rather than on highly accurate real-time imagery.
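The paragraph above amounts to a simple threshold rule over fleet outcomes. Here is a minimal sketch of that idea; the function names, the 0.999 threshold, and the sample minimum are all invented for illustration and are not Tesla's actual system.

```python
# Illustrative fleet-learning rule: aggregate pass/fail outcomes per
# road segment and mark it safe once the success rate clears a
# threshold. All names and numbers here are hypothetical.
from collections import defaultdict

SAFE_THRESHOLD = 0.999  # hypothetical required success rate
MIN_SAMPLES = 1_000     # don't judge a segment on too little data

outcomes: dict[str, list[bool]] = defaultdict(list)


def record_traversal(segment_id: str, succeeded: bool) -> None:
    """Log one vehicle's attempt at a segment (e.g. a dark tunnel)."""
    outcomes[segment_id].append(succeeded)


def is_marked_safe(segment_id: str) -> bool:
    attempts = outcomes[segment_id]
    if len(attempts) < MIN_SAMPLES:
        return False  # not enough fleet data yet
    return sum(attempts) / len(attempts) >= SAFE_THRESHOLD
```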

With the arrival of Autopilot hardware 3.0, Tesla's algorithms, backed by its massive data advantage, are likely to iterate faster, and edge cases should be handled more smoothly. For now, though, the capabilities of the HW3.0 system and the neural networks running on Tesla's independently developed chips remain limited, and pushing extremely complex deep neural networks toward human-brain-like judgment is still very difficult.

Tesla's driver monitoring system also needs improvement

Beyond the Autopilot issues, the NTSB also proposed at the hearing that Tesla should "cooperate to develop" a driver monitoring system.

In the 2018 accident, Walter Huang relied far too much on Tesla's driving assistance: despite the system repeatedly reminding him to put both hands on the steering wheel, he stayed focused on a game on his phone. The NTSB stated bluntly that unless Tesla installs monitoring devices, the system will keep being misused and similar accidents will recur.

Tesla has always relied on pressure sensors in the steering wheel to detect whether the driver's hands are on it. As early as 2016, after Tesla's first fatal accident, the NTSB urged Tesla to develop a more intelligent system for monitoring whether drivers are paying attention to the road and traffic, emphasizing that visual monitoring, such as in-cabin cameras, should be introduced for better oversight.

Tesla, however, has consistently turned a deaf ear to the NTSB's reminders.

That Tesla owners let themselves be distracted by other things while driving is, in fact, partly the product of Tesla's early exaggerated marketing. Yet Tesla today is focused on technology development and has not shown a serious enough problem-solving attitude toward the occasional crash. And of course, following the NTSB's recommendations would raise Tesla's production costs yet again.

So, what do readers think about the Tesla crash? Can Tesla really build cars at low cost while ensuring safety?


