Is Tesla’s Driving Tech Getting a Free Pass?

The idea of automating car travel so that the driver essentially becomes a passenger has featured in numerous science fiction movies, television shows, and books. As with many such technologies, numerous companies are now trying to advance self-driving to levels previously consigned to fiction.

The most well-known of these is the software offered by Tesla. Elon Musk extols the benefits of automated driving software on a regular basis, even going so far as to say that solving self-driving is “the difference between Tesla being worth a lot of money or worth basically zero.”

Despite this, and Musk’s repeated insistence that his company is the market leader, Tesla does not release data on the performance of its software. There are two main branches of Tesla’s advanced driver assistance systems (ADAS): ‘Autopilot,’ and the more recently released ‘Full Self-Driving’ (FSD), which seeks to build on what Autopilot offers. The distinction between the two – perhaps intentionally – is not widely understood among the general public.

Autopilot functions only on highways and freeways, matching the vehicle’s speed to surrounding traffic and steering within clearly marked lanes. FSD, on the other hand, relies on visible-light cameras and neural networks trained on an extensive dataset drawn from the actions of Tesla drivers, and aims to automate driving (under supervision) on all roads. Other companies in the sector, by comparison, utilise highly trained safety drivers, LIDAR, and simulated driving.

Safety data for Tesla’s software has been few and far between. However, on June 15, the National Highway Traffic Safety Administration (NHTSA) released its first tranche of data on incidents involving vehicles using ADAS. While it is fair to ask whether this data should have been collected earlier, and despite caveats – Steven Cliff, NHTSA’s administrator, stated that “the data may raise more questions than they answer” – there are interesting points to note.

Tesla accounts for the bulk of the 392 incidents in the data: its vehicles using ADAS were involved in 273 crashes, approximately 70% of those analysed by NHTSA over roughly 10 months. By comparison, Hondas were involved in 90 crashes, Subarus in 10, and other manufacturers’ vehicles in 5 or fewer each. All other manufacturers combined account for 119 crashes, meaning Tesla has more than double the crashes of the rest of the field put together. While Teslas equipped with ADAS are the most numerous on the roads, the number of incidents is still deeply concerning. NHTSA’s reporting order was issued on June 29, 2021, with the first report filed on July 20. In those 10 months, there were 273 crashes involving Tesla ADAS that “involved a vulnerable road user or resulted in a fatality, a vehicle tow-away, an airbag deployment, or any individual being transported to a hospital for medical treatment.”

Tesla crash in Laguna Beach, California. (Laguna Beach Police Department)

Given the warning about drawing firm conclusions from the data, it is worth considering what is missing from it. Most notable, perhaps, is which version of the ADAS software was involved in each incident. Only Toyota, General Motors, Porsche, Lucid, and Hyundai have disclosed the version in any of their incidents. By contrast, every entry for Tesla is marked “Redacted, May Contain Confidential Business Information,” with the data methodology noting that the redaction requests came from Tesla itself. The same is true of all five of Ford’s entries, the sole APTIV entry, and one of BMW’s three. The question is: why are Tesla and these other companies permitted to have this information redacted? The public – and lawmakers – have a right to know which aspects of a manufacturer’s ADAS software are most dangerous, given that they are being tested on our roads and among the public.

Despite the caveats, we can draw certain conclusions from the data. A total of 46 of the incidents involving Tesla vehicles occurred on streets, rural roads, or at intersections; the remainder occurred on highways or freeways, or in unknown locations. As Autopilot works only on highways and freeways, we can determine that at least 46 of the incidents involved FSD. As FSD is both purportedly the more advanced system and the one operating in areas with pedestrians and cyclists, this is very worrying. Even the incidents that occurred during the use of Autopilot are concerning. If Tesla’s software cannot competently manage speed-matching and staying within clearly marked lanes, should the company be trusted to unleash software attempting more complex manoeuvres on the roads?
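Readers who want to check this deduction can do so against the raw release, which NHTSA publishes as a downloadable CSV. Below is a minimal sketch in Python of the filtering involved; the file name and the “Make” and “Roadway Type” column headers are assumptions based on the public ADAS release, so consult the data dictionary that accompanies the download before relying on them.

```python
# A minimal sketch of the deduction above, using only the standard library.
# Assumptions: the file name and the "Make" / "Roadway Type" headers match
# NHTSA's public ADAS incident-report CSV; verify against the data dictionary.
import csv
from collections import Counter

# Roadway types where Autopilot does not operate; Tesla incidents here
# therefore point to FSD.
NON_HIGHWAY = {"Street", "Rural Road", "Intersection"}

crashes_by_make = Counter()
tesla_non_highway = 0

with open("SGO-2021-01_Incident_Reports_ADAS.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        make = row["Make"].strip()
        crashes_by_make[make] += 1
        if make == "Tesla" and row["Roadway Type"].strip() in NON_HIGHWAY:
            tesla_non_highway += 1

print(crashes_by_make.most_common(5))
print("Tesla incidents off highways/freeways:", tesla_non_highway)
```

One caveat on this sketch: NHTSA’s raw file can contain multiple report rows for the same incident as reports are updated, so a faithful count would first need to deduplicate by report ID, which the sketch does not attempt.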

Most fundamentally, we should not have to puzzle our way to such an unsatisfactory and incomplete conclusion. If we are expected to trust our safety to software such as this, both the public and the experts should be able to consider the data in full. Politicians are similarly concerned by the data. Following NHTSA’s release, Advocates for Highway and Auto Safety hosted a press conference with U.S. Senators Richard Blumenthal (D-CT) and Edward Markey (D-MA), U.S. Representative Jan Schakowsky (D-IL), the League of American Bicyclists, and Joan Claybrook, a former administrator of NHTSA.

The lack of federal standards for these vehicles was discussed, and politicians’ efforts to demand that Tesla and others comply with auto safety laws were front-and-centre. Sen. Markey highlighted Tesla’s involvement in five of the six fatalities in the data, also noting its incomplete nature. He stated that the report is “yet another signal that Tesla is running regulatory red lights and failing to take common-sense safety measures to protect the public.”

Rep. Schakowsky argued that “Autopilot means something that would suggest that you can just let go of it all. You just set it, and then you’re on your path and you don’t have to worry anymore. That is so far from the truth.” Sen. Blumenthal argued that NHTSA “should complete those investigations because lives are at stake,” while Claybrook noted that California was “one of the biggest public laboratories” for ADAS.

The discontent among Californians can perhaps be seen in the recent Senate run of Dan O’Dowd, the previously little-known CEO of Green Hills Software. O’Dowd ran on a platform of seeking to “make computers safe for humanity,” focusing his campaign on a single issue: banning Tesla’s FSD. Despite this narrow platform, and running against a Democratic incumbent, he secured 62,067 votes, or 1.1% of the total.

O’Dowd continues to campaign against FSD under his organisation, The Dawn Project, and stated after seeing the data that “Tesla’s self-driving systems are the most dangerous consumer software in America, and pose a clear and present danger to all our communities. It has absolutely no place on our roads. We would not tolerate this degree of risk from any other technology, and we should not tolerate it simply because it is produced by a large and powerful corporation. Americans should not be treated as Tesla’s crash test dummies.” O’Dowd called on regulators and lawmakers to “put the public’s interest first and stop Tesla’s technology from putting even one more innocent person at risk.”

Musk has recently, and very publicly, shifted his political allegiance to the Republican Party, citing in a tweet the lack of support he has received from the Biden administration. With pressure rising on Musk over his treatment of workers, and with opposition to Tesla’s ADAS – FSD in particular – likely to be emboldened by NHTSA’s data, he may have to get used to more direct scrutiny of his activities, and indeed to direct political and public opposition, in the near future.