The National Highway Traffic Safety Administration has released its first batch of data for semi-autonomous driving technology. As The New York Times explains, the agency linked 392 crashes to partial self-driving and driver assistance systems in the 10 months between July 1st, 2021 and May 15th, 2022. About 70 percent of those, 273, were Tesla vehicles using Autopilot or the Full Self-Driving beta. Honda cars were tied to 90 incidents, while Subaru models were involved in 10. Other makes, including Ford, GM, VW and Toyota, had five incidents or fewer.

Out of the 98 crashes with injury reports, 11 resulted in serious injuries, and five of the Tesla incidents were fatal. A separate tally of 130 crashes involving fully automated driving systems included 108 with other vehicles and 11 with "vulnerable" road users like cyclists and pedestrians.

The findings are a response to a Standing General Order requiring that car manufacturers and operators report crashes to the NHTSA when Level 2 or higher automation was active at the time of the incident. The transportation agency hopes the info will support a "more data-driven approach" to safely rolling out self-driving tech, including regulation and education.

As administration head Steven Cliff told the press, the data doesn't offer any conclusions by itself. There are roughly 830,000 Autopilot-equipped Tesla vehicles in the US, for instance, so they may dominate incident reports simply because they're among the most common semi-autonomous cars. Ford, GM and others have equivalents, but those systems are frequently optional (Autopilot is standard on Teslas) and thus rarer on the road.

The statistics nonetheless draw attention to the multiple investigations into crashes like these, including by the National Transportation Safety Board. One Tesla driver in California is also facing felony charges from state prosecutors over a deadly 2019 incident. While companies like Tesla have long argued that their driver assists are safer than exclusively human control, the NHTSA, NTSB and other bodies clearly want a better understanding of real-world safety issues before they embrace autonomous driving in earnest.