
An investigation into Tesla has caused an uproar on social media, with some netizens even describing its findings as “epic”:

The investigation comes from the National Highway Traffic Safety Administration (NHTSA), and one sentence in the released documents stands out: Autopilot aborted vehicle control less than one second prior to the first impact.

This NHTSA finding landed like a bombshell, because it sounds like “the robot hands the steering wheel back to you right before the car hits something”…

Many netizens then began to ask: with a design like this, can Tesla deny that the accidents were Autopilot’s fault?

Ah, this…

Handing the wheel back less than a second before the crash

The NHTSA investigation into Tesla has been underway since last August.

The direct trigger for the investigation was that more than one Tesla, with the Autopilot assisted-driving system engaged, drove into an existing accident scene and hit an ambulance, a police car, or a crashed vehicle already parked at the roadside.

They didn’t know until they looked, and what they found was startling.

NHTSA found that in 16 such crashes, most of the vehicles had Forward Collision Warning (FCW) activate before the collision, and in roughly half of them Automatic Emergency Braking (AEB) subsequently intervened as well. In the end, none of this prevented the crashes, and one person unfortunately died.

More disturbingly, across all 16 crashes, Autopilot aborted control on average less than a second before the actual impact, leaving nowhere near enough time for a human driver to take over.
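To make the timing gap concrete, here is a minimal Python sketch (not from the NHTSA report) that compares the hand-back lead time with an assumed perception-reaction time of about 1.5 seconds, a commonly cited ballpark for an attentive driver; the threshold, function, and numbers are illustrative assumptions only.

```python
# Illustrative sketch only: compares how long before impact control is handed back
# with an assumed driver perception-reaction time, to show why <1 s is too short.
TYPICAL_REACTION_TIME_S = 1.5  # assumed ballpark for an attentive driver, not an official figure

def takeover_feasible(handback_to_impact_s: float,
                      reaction_time_s: float = TYPICAL_REACTION_TIME_S) -> bool:
    """Return True only if the driver has at least enough time to react at all;
    actually steering or braking effectively would need even more margin."""
    return handback_to_impact_s >= reaction_time_s

print(takeover_feasible(0.9))  # <1 second, as in the 16 crashes -> False
print(takeover_feasible(8.0))  # ~8 seconds, when the scene first became visible -> True
```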

Crash videos showed that the accident scene ahead would have been visible to the human driver for roughly 8 seconds before the collision.

But forensic data from 11 of those crashes showed that no driver took evasive action in the 2 to 5 seconds before impact, even though all of them kept their hands on the steering wheel as Autopilot requires.

Perhaps most drivers still “trusted” Autopilot at the time: in nine of the crashes, the driver did not respond to the system’s visual or audible alerts in the final minute before the collision.

And four of the vehicles issued no alerts at all.

Now, to better understand the safety of Autopilot and related systems, and the extent to which they may undermine driver supervision and increase risk, NHTSA has decided to upgrade the preliminary investigation to an Engineering Analysis (EA).

It has also expanded the scope to all four Tesla models: the Model S, Model X, Model 3, and Model Y, about 830,000 vehicles in total.

And as soon as the results of this investigation came to light, victims began to speak out.

One user said that as far back as 2016, her Model S also crashed into a parked car while changing lanes. Tesla told her it was her fault, because she had hit the brakes before the crash, which caused Autopilot to lose control. But the system had also raised an alarm at that moment, the woman said, which meant that whatever she did would be wrong: she would be wrong not to brake, and wrong to ignore the alarm.

Musk, however, did not respond to the matter directly. Instead, about 6 hours after netizens broke the news, he happened to tweet an NHTSA investigation report on Tesla dating back to 2018.

The report said that through all of NHTSA’s testing up to that point, Tesla’s Model S (built in 2014) and Model X (built in 2015) had shown the lowest probability of injury in a crash of any vehicle.

As it turned out, the newer Model 3 (built in 2018) then took that top spot from the Model S and Model X.

△ Tesla also took the opportunity to do a round of self-promotion

This was taken as a response to the Tesla crash in Shanghai a few days earlier, in which the car was wrecked but no one was injured.

Interestingly, just as everyone was decrying how unreliable Tesla’s Autopilot is, someone stepped forward and argued that people were being misled into bashing Tesla.

In fact, even though Autopilot cut out 1 second before the crash, Tesla did not shift the blame onto the human driver: in its own accident statistics, any crash in which Autopilot was active at any point within 5 seconds of the collision is attributed to Autopilot.
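For clarity, here is a minimal sketch of that counting rule as described above; it is illustrative only, not Tesla’s actual methodology, and the function and parameter names are made up for the example.

```python
# Illustrative sketch of the attribution rule described in the article:
# a crash counts as an Autopilot crash if the system was engaged at any point
# within 5 seconds before the collision.
ATTRIBUTION_WINDOW_S = 5.0

def counted_as_autopilot_crash(seconds_since_autopilot_disengaged: float) -> bool:
    """seconds_since_autopilot_disengaged: time between the last moment Autopilot
    was engaged and the impact (0 means it was still engaged at impact)."""
    return seconds_since_autopilot_disengaged <= ATTRIBUTION_WINDOW_S

# Autopilot disengaged 1 second before impact -> still counted against Autopilot
print(counted_as_autopilot_crash(1.0))  # True
```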

Later, Tesla officials also came out to confirm this rule.

However, even though the human driver is not made to take the blame, the questionable move of cutting off Autopilot one second before the crash still does not change the fact that the driver has no time to take back the wheel.

Is Autopilot reliable?

Having read NHTSA’s findings, let’s take another look at Autopilot, the system at the center of this controversy.

Autopilot is Tesla’s suite of advanced driver assistance systems (ADAS), which sits at Level 2 (L2) on the driving automation scale defined by SAE International. (SAE divides driving automation into six levels, from L0 to L5.)
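For reference, here is a rough, paraphrased summary of the six SAE levels (not the official SAE J3016 wording), showing where an L2 system like Autopilot sits.

```python
# Paraphrased summary of the SAE driving-automation levels (not official wording).
SAE_LEVELS = {
    0: "No automation: the human does all the driving",
    1: "Driver assistance: steering OR speed is assisted",
    2: "Partial automation: steering AND speed assisted, driver must supervise",  # Autopilot is here
    3: "Conditional automation: system drives, human must take over on request",
    4: "High automation: no human takeover needed within a limited domain",
    5: "Full automation: no human driver needed anywhere",
}

for level, description in SAE_LEVELS.items():
    print(f"L{level}: {description}")
```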

According to Tesla’s official description, Autopilot currently offers assisted steering, acceleration, and braking within the lane, automatic parking, and “summoning” the car out of a garage or parking space.

So does that mean the driver can just let go and leave everything to the car? Not at all.

At the current stage, Tesla’s Autopilot can only play an assistance role; it is not fully autonomous driving, and it still requires the driver to actively and attentively supervise it.

But in its official introduction, Tesla also says a little about “full self-driving” capability: all new Teslas have the hardware needed for full self-driving in almost all situations in the future, including the ability to make both short and long trips without the need for a driver.

For Autopilot to achieve these goals, however, it needs to be far safer than human drivers. Tesla says it has proven this over billions of miles of driving, and Musk has claimed more than once that Tesla’s full self-driving is far safer than the average driver.

But is this the case?

Setting aside the assessments given by Tesla and Musk, in practice Autopilot’s safety record has been controversial.

For example, frequent “phantom braking” incidents have pushed it into the spotlight again and again, and they are also one of the main reasons NHTSA launched this investigation.

“Phantom braking” means that with Tesla’s Autopilot driver assistance engaged, the vehicle performs an unnecessary emergency brake even though there is no obstacle ahead and no risk of hitting the vehicle in front.

This poses a huge safety hazard to drivers and other vehicles on the road.

Beyond that, a closer look at the trending incidents involving Tesla makes it easy to see that many of them concern Autopilot safety:

So what does NHTSA think about this?

In the document, NHTSA also reminds readers that there are currently no fully self-driving vehicles on the market: “every vehicle requires the driver to be in control at all times, and all state laws hold the driver responsible for the operation of his vehicle.”

As for how the investigation might affect Tesla, NHTSA says that if safety-related defects are found, it has the authority to issue a “recall request” letter to the manufacturer.

Finally, a quick poll: do you trust Autopilot?

