The Ethical Implications of Elon Musk and Tesla’s Contempt for Public Safety
When Caltech presented me with the Distinguished Alumni Award in 2020, they quoted something I had written many years ago. It was long before I was fully aware of Tesla’s reckless deployment of experimental Full Self-Driving (FSD) software on public roads.
“I work on technology, but I also have to get people to understand the fundamental ethical problem so that the decision-makers make the right decisions. When I was a student at Caltech, there was a course on engineering ethics that taught us that we could design a product that wasn’t very good or of low quality but that we must say no at the point we realize a device is an outright danger to humanity. To refuse to make a bad product is an acceptable moral choice, but a potentially insufficient one. At some point, you have to call it out, become the whistle-blower, and foster support from other people if you believe there is a tragedy or catastrophe in process,” I wrote at the time.
People trust engineers with their lives, and as engineers we have a responsibility to make our products safe. This is a guiding principle of engineering. It is what we were taught at Caltech.
Despite his claims to the contrary, Elon Musk is not an engineer, so he doesn't understand this.
I am now in the position of having to blow the whistle on Tesla's Full Self-Driving software. Something must be done to stop Elon Musk from applying the Silicon Valley development methodology of 'fake it until you make it' and 'move fast and break things' to safety-critical software. That approach is not acceptable for technology that people's lives depend on.
Elon Musk and Tesla have shipped a bug-riddled engineering prototype of a safety-critical product to millions of safety-conscious consumers, telling them it is four times safer for them and their families than driving manually. Tesla and Elon Musk's reckless self-driving experiments on public roads have already squandered the lives of dozens of consumers who only wanted to make their families safer.
Tesla hasn't even implemented many of the features required for Full Self-Driving, such as recognizing and obeying traffic signs, not running down children in crosswalks, not blowing past stopped school buses with their lights flashing, obeying school-zone speed limits, always driving on the right side of the road, and driving in reverse.
Both my safety advocacy group, The Dawn Project, and others have been reporting these safety defects to Tesla for years, and Tesla refuses to fix them! Tesla seems to prioritize gimmicks, such as its "Autopark" feature, over fixing safety-critical defects.
We have spent thousands of hours testing the safety of Tesla’s self-driving software, and our tests demonstrate that Tesla Full Self-Driving is much less safe than driving manually.
Tesla's mileage statistics support this conclusion. Consumers who paid up to $15,000 for Tesla's Full Self-Driving software turn it off 85% of the time. Despite Elon Musk betting the future of Tesla on autonomy, the automaker's flagship self-driving technology can't even pass a DMV driver's test, putting Full Self-Driving behind even the worst 16-year-old with a license.
When you figure out that Musk and Tesla have been lying to you about the safety of Full Self-Driving, you too will come to share my animosity toward them.
Elon Musk is setting an example for the technology industry and millions of impressionable fans: that you can become the richest person in the world and build a trillion-dollar company by replacing safety-critical engineering principles with Silicon Valley software development practices.
We must take the grave ethical implications of Tesla's reckless self-driving experiment on public roads more seriously. The technology sector cannot throw caution to the wind and disregard the foundations of ethical engineering when lives are on the line. Elon Musk and Tesla must be brought to heel.