Frequent Tesla accidents: what lies ahead for autonomous driving?

  Today's viewpoint

  ◎Intern reporter Zhang Jiaxin

Recently, incidents such as a consumer-rights protest at the Shanghai Auto Show and a fatal crash in Guangzhou, in which a Tesla struck a concrete wall and one person died, have pushed Tesla into the center of public attention.

On the 27th, the topic "Tesla crashes into a baozi shop in Jiangsu" trended on Weibo once again.

Around the same time in the United States, a Tesla crash left two people dead.

  Whatever the causes, and beyond the grief they have brought, these incidents have prompted questions and debate about today's autonomous driving technology: How safe is it, and how much attention does it still demand from drivers?

And will people still be able to buy self-driving cars in the future?

  An article recently published in The Conversation, an Australian outlet, argues that although the technology required for higher levels of automation is advancing rapidly, producing a car that can complete an entire journey safely and legally without the driver's attention remains a huge challenge.

Before such vehicles can safely enter the market, three key obstacles must be overcome: technology, regulation, and public acceptance.

  How to understand "autonomous driving" technology

  First of all, we must understand what "autonomous driving" technology is.

Autonomous vehicle technology is divided into six levels, from Level 0, "no automation", meaning a traditional vehicle with no self-driving functions, to Level 5, "full automation", a vehicle that can independently accomplish anything a human driver can.

  Most automated vehicles on the market today still require human intervention.

For example, Level 1 "driver assistance" can keep the vehicle in its lane or control its speed, while at Level 2, "partial automation", the driver must still remain in control of steering and speed at all times.

  Level 3 vehicles have more autonomy and can make some decisions on their own.

But if the system cannot handle the situation, the driver must still stay alert and be ready to take control of the vehicle.

  Level 4 and Level 5 vehicles are more highly automated; a human driver does not necessarily need to take part in the driving task at all.

Vehicles at these two levels can steer, brake, accelerate, monitor the road and other vehicles, and respond to events, deciding for themselves when to change lanes and turn.

  However, Level 4 vehicles can operate only in limited places and under limited conditions.

Level 5 represents a truly self-driving car that can go anywhere at any time, much as a human driver can.

However, the transition from Level 4 to Level 5 is more difficult than the transitions between other levels and may take several years to achieve.
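
  For readers who want the taxonomy at a glance, the following minimal Python sketch restates the six levels described above as a simple lookup table. The wording is a paraphrase of this article, not any standards body's official text.

# Plain-language summary of the six automation levels described above.
SAE_LEVELS = {
    0: "No automation: a traditional vehicle with no self-driving functions.",
    1: "Driver assistance: keeps the vehicle in its lane or controls its speed.",
    2: "Partial automation: the driver must still oversee steering and speed at all times.",
    3: "Conditional automation: the car makes some decisions, but the driver must stay alert and take over when needed.",
    4: "High automation: no driver input needed, but only in limited places and conditions.",
    5: "Full automation: drives anywhere, anytime, like a human driver.",
}

def describe(level: int) -> str:
    """Return the plain-language description for a given automation level."""
    return SAE_LEVELS.get(level, "Unknown level")

if __name__ == "__main__":
    for lvl in range(6):
        print(f"Level {lvl}: {describe(lvl)}")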

  Machines must "learn" from a vast number of real driving scenarios

  Autonomous driving software is a key feature that distinguishes highly automated vehicles from other vehicles.

The software relies on machine learning algorithms and deep neural networks containing millions of virtual neurons that loosely mimic the human brain.

  To learn to recognize and classify objects, the neural networks must be trained on millions of video and image examples drawn from real driving conditions.

The more diverse and representative the data, the better the neural network can recognize and respond to different situations.

Training a neural network is a bit like holding a child's hand while crossing the street: the system learns patiently, through continual experience and repetition.

  Although these algorithms can detect and classify objects very accurately, neural networks still cannot imitate the complexity of actual driving.

Self-driving cars not only need to detect and recognize people and other objects, but they must also interact with, understand and react to the behavior of these objects.

They also need to know what to do in unfamiliar road conditions.

Without a large number of examples covering every possible driving scenario, it becomes much harder to use deep learning to train the system to handle emergencies.
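
  As an illustration of what "training" means here, below is a minimal, hypothetical sketch using PyTorch, with randomly generated stand-in data instead of real road imagery. Production systems are vastly larger and train on millions of labeled frames; the class names, sizes, and network shape are illustrative assumptions only.

import torch
import torch.nn as nn

# Tiny stand-in for camera frames: 64x64 RGB images and 3 object classes
# (e.g. "pedestrian", "vehicle", "cyclist" in a real system).
images = torch.randn(256, 3, 64, 64)
labels = torch.randint(0, 3, (256,))

# A small convolutional network that scores each image against the 3 classes.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 3),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Repeated exposure to labeled examples is what "training" means:
# the network's weights are nudged until its predictions match the labels.
for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")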

  Vehicles should undergo rigorous evaluation before taking to the road

  Policymakers and regulators around the world are striving to keep up with the development of autonomous vehicle technology.

Today the industry remains largely self-regulated, especially when it comes to determining whether the technology is safe enough for open roads.

The article in The Conversation says regulators have largely failed to provide standards in these areas.

  The performance of autonomous driving software must be tested under realistic conditions, starting with comprehensive safety testing and evaluation.

Regulators should develop a standard test plan to allow companies to benchmark their algorithms against standard data sets before their vehicles are allowed to go on the road.

  In Australia, current laws do not support the safe commercial deployment and operation of autonomous vehicles.

The Australian National Transport Commission is leading a nationwide reform to support the innovation and safety of autonomous driving technology so that Australians can enjoy the benefits of this technology.

  The article argues that rules and regulations for autonomous driving technology should take a gradual, staged approach to certification.

Under this approach, an autonomous driving system would first be evaluated in simulation and then in a controlled real-world environment.

Only when vehicles pass specific benchmark tests can regulators allow them to drive on open roads.
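
  As a rough illustration of this staged, gated approach, here is a minimal Python sketch; the stage names, scores, and thresholds are hypothetical placeholders, not any regulator's actual criteria.

from typing import Callable

def certify(system_name: str, stages: list[tuple[str, Callable[[], float], float]]) -> bool:
    """Run each evaluation stage in order; stop at the first failure."""
    for stage_name, evaluate, required_score in stages:
        score = evaluate()
        print(f"{system_name} | {stage_name}: score {score:.2f} (required {required_score:.2f})")
        if score < required_score:
            print(f"{system_name} not approved for open roads.")
            return False
    print(f"{system_name} approved for open-road operation.")
    return True

if __name__ == "__main__":
    # Placeholder evaluations standing in for real test results.
    stages = [
        ("simulation",             lambda: 0.97, 0.95),
        ("controlled environment", lambda: 0.93, 0.90),
        ("standard benchmark set", lambda: 0.91, 0.90),
    ]
    certify("ExampleDrive v1", stages)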

  Public acceptance is key to trust in the technology

  The public also needs to be involved in the deployment and adoption decisions of autonomous vehicles.

If self-driving technology is not regulated to ensure public safety, there is a real risk that public trust will be destroyed.

A lack of trust affects not only those who might buy self-driving cars, but also everyone who shares the road with them.

  Finally, the recent incidents should serve as a catalyst for regulators and industry to build a strong safety culture to guide innovation in autonomous driving technology. Otherwise, autonomous vehicles face a bumpy road ahead and may not travel far.