
Dan O’Dowd on Lies, a Hitler Salute and How Your Tesla Might Murder You
Dan O’Dowd is one of the world’s foremost experts in designing software that never fails and cannot be hacked. Over the past four decades, he has built secure operating systems for some of the most high-stakes projects in aerospace and defense, including Boeing’s 787 Dreamliner, Lockheed Martin’s F-35 fighter jet, the Boeing B-1B intercontinental nuclear bomber, and NASA’s Orion Crew Exploration Vehicle.
Since earning his degree from the California Institute of Technology in 1976, O’Dowd has been at the forefront of developing safety-critical systems and unhackable software, creating certified secure real-time operating systems used across industries. Dan is also the founder of both the Dawn Project and Green Hills Software.
Initially a fan of Tesla, O’Dowd grew alarmed after analyzing videos that revealed critical failures in the company’s Full Self-Driving (FSD) technology—instances where the system failed to recognize school buses and misinterpreted traffic signs. He likens Tesla’s approach to some of the most notorious corporate failures, from Ford’s Pinto gas tank fiasco to Takata’s deadly airbags. Unlike Tesla, O’Dowd argues, competitors such as Waymo have developed self-driving systems that are genuinely reliable. He also points to Elon Musk’s increasingly polarizing public persona and political controversies as factors undermining Tesla’s credibility and eroding its public image.

Scott Douglas Jacobsen: Thank you for taking the time to speak with me, Dan. When did you first begin to suspect that Tesla’s “Full Self-Driving” might be a misleading or inadequate description of what the system actually delivers in practice?
Dan O’Dowd: The realization came gradually. I was a fan of Tesla. I own eight Teslas myself. They’ve been the only cars I’ve driven since 2010—15 years. My wife has been driving a Tesla for 13 years, and it is the same Model S we bought back then. So, we were big fans of Tesla for a long time.
The first signs that things were not as represented came around 2016, when Elon Musk made bold claims that Tesla had solved the self-driving problem. He asserted that their system was safer than a human driver and announced they would demonstrate it. Musk described a trip where he would get into a Tesla at his house in Los Angeles, and the car would drive him across the country, drop him off in Times Square, and then park itself. He even gave a specific timeline: the demonstration would happen six months later. I remember hearing that and thinking, “Wow, that’s exciting.” If Tesla could do that, they would have essentially solved autonomous driving.
So, I waited, and waited. The date came, and when people started asking about it, Musk said there had been some minor hang-ups and a few details to work out, but the demo would happen in another four to six months. I waited again. Then, that date came and went. People started asking about it again, but Musk stopped answering this time. There was no new timeline and no further updates. The entire project was quietly abandoned.
A year or two later, it became clear that the promised demonstration would never happen. There was no evidence that Tesla had solved Full Self-Driving (FSD). Fast-forward to 2020 or 2021, and someone suggested I look at YouTube videos of Tesla’s FSD in action—real-world tests where people installed cameras in their cars and recorded what the system did.
I started watching the videos, and they were shocking. The cars were running red lights, rolling through stop signs, slamming on the brakes in the middle of the road, and doing all kinds of erratic and dangerous things. At first, I thought, “Well, every system has some bugs—it’s part of the development process.” However, to understand the problem’s scope, I asked one of my team members to analyze the videos.
We compiled a detailed report by logging the elapsed time and documenting the failures in each video. The results were devastating. It became clear that Tesla’s Full Self-Driving system fell far short of Musk’s claims.
The report showed that the system failed frequently—on average, it did something stupid every eight minutes. Over a longer stretch, days of driving, it would essentially crash your car if you did not monitor it like a hawk and intervene. Yet they’re delivering this product to ordinary people who want it and are willing to pay for it.
They started with a small number of users—about 100 initially—which didn’t seem like too many. Then, after about a year, they expanded to 11,000, then 60,000, and eventually to half a million people, which is where we are today. So, this product, which is supposed to be fully self-driving, has major flaws. For instance, if you turn it on and a school bus stops, puts on its flashing lights, extends its stop sign, and opens the door for kids to get off, the car won’t stop. It’ll zoom past the bus, even with children running into the road.
We created a Super Bowl commercial two years ago showing exactly this scenario. Several months later, in North Carolina, a child stepped off a school bus and was struck by a Tesla operating on Full Self-Driving. The kid hit the windshield and ended up in the hospital for three months, on a respirator, with a broken collarbone and leg. The system does not recognize what a school bus is.
How can a company ship a product called “Full Self-Driving” that doesn’t even know what a school bus is? The system interprets a school bus with flashing lights as a truck with its hazard lights on. And what does a driver typically do when approaching a truck with its hazard lights on? You look around the truck to see if anyone is coming from the other direction. If the road is clear, you might slow down but ultimately go around the truck and continue driving. That’s exactly what Tesla’s Full Self-Driving does. It treats a stopped school bus like a truck with hazard lights—it drives past without stopping.
We aired that commercial, and someone asked Elon Musk about this issue, specifically about Teslas running over kids getting off school buses. Musk responded, “This will greatly increase public awareness that a Tesla can drive itself (supervised for now).” That was two years ago, and the problem still hasn’t been fixed. The system still doesn’t know what a school bus is.
We also ran a full-page ad in The New York Times and another Super Bowl ad to raise awareness. Musk hasn’t done anything about it. I’ve never seen any other company behave this way—except maybe a cigarette company. Companies like that deliberately sell products while telling people they’re healthy, safe, and good for them, even when they are not. Tesla’s behavior is despicable. It’s hard to believe a company would act this way.
At this point, there’s no excuse for any of it. It’s the depths of greed and depravity. The right thing to do would be to take it off the road and fix it. If this were GM, Toyota, or BMW, they would immediately assign 100 engineers to the problem. But as far as Musk is concerned, he’s not fixing it. Recently, he’s been focused on windshield wipers, which, by the way, still don’t work properly.
The system cannot even handle windshield wipers properly—how can it drive a car? I’ve never seen such an incomplete product sold to consumers, especially a safety-critical one. If this were some trivial phone app that occasionally failed, that would be acceptable. But this is a car, and people’s lives are at stake.
Over 40 people have already died in Tesla self-driving crashes. So, where do we go from here? Tesla is developing the software this way—“move fast, break things.” They keep doing it and continue shipping it to more and more people.
It’s hard to comprehend. I can’t imagine any respectable company doing this, yet Tesla does it daily. For instance, their system doesn’t even know what a “Do Not Enter” sign means. That should be an easy thing to program. A school bus might take additional work, but a “Do Not Enter” sign? It’s straightforward: don’t go here. The car doesn’t recognize the sign, doesn’t obey it, and will go the wrong way down a one-way street because it doesn’t understand what “Do Not Enter” or “One Way” signs mean. We’ve tested all of this, and the results are astonishingly bad.
How can you sell a product for $15,000 and tell people it’s 10 times safer than a human driver? Sometimes Musk says it’s four times safer. The reality is that it’s not even as good as the worst human driver on the road. And who’s the worst driver on the road? A 15-and-a-half-year-old with a learner’s permit, who must practice with a parent in the car. Even then, that kid must log 40 or 50 hours of road driving, and their parents must sign off that they’ve practiced.
Every parent who has gone through this knows how nerve-wracking it is to sit in the passenger seat while their kid learns to drive. But no sane person would sit in the passenger seat of a fully self-driving car with no one in control. No one would let it drive without being able to intervene. Elon Musk wouldn’t do it. The biggest Tesla fanboy wouldn’t do it. I wouldn’t do it.
Well, Arthur did it. He sat in the passenger seat to test it because we wanted to know if it would work. It does work—barely. We’ve got a great video of him sitting in the passenger seat while the car drives with no one in control. But that’s not something anyone would do willingly. Everyone would rather sit with their 15-and-a-half-year-old learner and not die.
Nobody rides in a Full Self-Driving car with the system in control, no one in the driver’s seat, and no ability to intervene. It is a far worse driver than any 15-and-a-half-year-old with a learner’s permit. Yet Elon Musk claims it is safer than any driver—10 times safer than the average driver. And for what purpose? To get people to give Tesla their money. They’ve picked up billions of dollars selling this product, telling people it will revolutionize transportation and make Tesla the most valuable company in the world. That’s why Tesla is worth more than all other car companies combined—because FSD is supposedly so amazing, the best self-driving software in the world. Musk says it all the time.
Of course, that leaves out competitors like Waymo, whose self-driving cars have completed over 4 million paid trips. Amazon has Zoox, and two or three companies in China operate self-driving cars. The one company that doesn’t actually have self-driving cars is Tesla. And here we are.

Jacobsen: When considering similar failures in the automotive industry, what case would you point to as a meaningful comparison? Are there historical examples where a car manufacturer was aware of a serious defect yet failed to address it, even as public scrutiny grew?
O’Dowd: Yes. One example is the Ford Pinto, whose gas tanks exploded in crashes during the 1970s. Those failures caused fatalities, and Ford faced massive fines and public backlash. Tesla’s FSD has already been involved in more fatal crashes than the Pinto gas tank failures. Another case is the Takata airbag scandal from 10 years ago: Takata airbags killed people when their inflators exploded and sprayed shrapnel. Tesla’s FSD fatalities have now exceeded the number of deaths caused by Takata airbags.
Another example would be Toyota’s sudden unintended acceleration issue from 15 to 20 years ago. People reported that their cars would suddenly accelerate out of control, leading to accidents and fatalities. Even in that case, the fatalities were fewer than those caused by Tesla’s FSD. These products—Ford Pintos, Takata airbags, and Toyota’s unintended acceleration—were either recalled or resulted in massive lawsuits and a significant reputational hit for the manufacturers. Yet Tesla’s FSD, despite its worse track record, is still on the road today, making money and boosting Tesla’s valuation.
Musk has directly linked Tesla’s valuation to FSD. He’s even said in a video that Tesla is “worth basically zero” without Full Self-Driving. With FSD, Tesla is valued higher than Toyota, GM, Ford, BMW, and Volkswagen combined despite having a tiny market share. Tesla’s sales declined last year, and FSD doesn’t deliver on its promises—it’s completely unsafe.
Jacobsen: How has the media generally responded when you’ve presented your findings in a measured, analytical way? I’ve seen a few interviews where you’ve laid out your case, but in at least one instance, the conversation devolved into a shouting match—instigated not by you but by the opposing side. What kind of pushback have you faced when presenting a clear, evidence-based assessment?
O’Dowd: There are generally two scenarios. One is when I’m debating a pro-FSD Tesla supporter. Those debates can get rather heated at times. The other is when we present evidence to journalists or legislators. We have mountains of evidence—hundreds of videos showing exactly what we say. I don’t just go out there and make claims. I have a whole team on staff that tests these systems. We analyze other reports and videos, and we invite people—journalists especially—to see it for themselves.
We tell journalists, “Do you want to see how this product works? Get in the car. We’ll take you for a drive.” Beforehand, we ask them, “Do you think this system is better than a human driver?” Everyone who gets out of the car afterward says, “No way. This isn’t even close to the skill of an average human driver.” It does crazy things. For instance, it will stop in the middle of railroad tracks and stay there. It will run red lights and stop signs.
We’ve taken high-profile individuals on these demonstrations. We took the Attorney General of California on a trip. We rented a school bus with a driver, set it up on the side of the road, and had the Tesla drive by as if the bus weren’t there. People are understandably nervous during these rides. In one test, we used a mannequin to simulate a child stepping out from behind the bus. The Tesla ran it down without hesitation.
We’ve taken congresspeople and state senators on similar rides. We even went to Sacramento with a dozen legislators who wanted to see what this system does for themselves. We’ve invited journalists from many outlets, offering them the chance to experience FSD firsthand. We plan to go to Washington, D.C., to give senators and congresspeople similar demonstrations. Many of them hear from Elon Musk and his supporters about how “great” FSD is—that it’s supposedly the best technology in the world. But that’s Musk’s marketing machine at work. He has 200 million followers, many amplifying his claims and attacking anyone trying to expose the truth.
I’ve been called a murderer countless times for pointing out the flaws in FSD. When we started this campaign three years ago, the overwhelming sentiment was pro-Elon and pro-FSD. Waymo hadn’t yet demonstrated its self-driving cars to the public—they were still under wraps—which made Tesla’s claims seem more credible. But things have shifted.
Now, though, Waymo has been successfully running fully driverless cars. They’re doing 150,000 self-driving taxi rides per week. Over the past year, they’ve completed over 4 million rides—4 million times, people have gotten into a Waymo car without a driver and traveled to their destinations safely, without worrying about the system failing. This happens daily in cities like Phoenix, San Francisco, Austin, and now Los Angeles. No one has been hurt. No one has been killed.
Meanwhile, Tesla’s FSD has been involved in at least 1,700 crashes, with 42 fatalities. Oh, wait, I’m told it’s now 44 fatalities—it keeps going up. The comparison couldn’t be more stark.
Jacobsen: You’ve mentioned the marketing machine behind Tesla and Elon Musk. Can you elaborate on how that influences the narrative surrounding Full Self-Driving (FSD) and its shortcomings?
O’Dowd: We’re up against one of the greatest marketing machines on Earth, selling a complete lie about this product. We’re doing our best to counter it; fortunately, more journalists and others are joining in. We even have a great video showing Elon Musk, year after year, looking directly into the camera and confidently claiming that Tesla will have Full Self-Driving working better than a human driver by the next year.
Every year for the last 10 years, he’s made this claim with great emphasis and certainty. And every single year, it doesn’t happen. Then the next year comes, and he says it again. And again. He’s even saying it now: “By the end of the year, for sure.” But it’s still pathetic. They haven’t even figured out how to handle something as basic as a school bus.
How can they claim they will roll this out globally when they can’t even handle school buses yet? It reminds me of the old joke in artificial intelligence research. If you ask someone when AI will arrive, they’ll always say, “10 years away.” And then, 10 years later, they’ll say the same thing. Musk does the same thing—except he says one year, every year, and expects people to forget. But the Internet now has a long memory.
We’ve compiled those clips of him making these claims year after year, and when you show the video to people, it has an effect. They’re shocked. It’s like, “Wow, this guy said that unequivocally, and he’s been wrong every time.” For example, in 2019, he claimed there would be 1 million robotaxis on the road by 2020. Where are those robotaxis?
There are robotaxis, though—just not from Tesla.
Google’s Waymo has robotaxis. Tesla? Zero. That’s not entirely true, though, because in October, they held an event on the backlot of Warner Brothers. They brought in about 500 to 1,000 people, let them ride in Tesla cars, and called them “robotaxis.” But the cars never left the Warner backlot. They drove around a fixed route late at night with no traffic, no lights, and no obstacles. It wasn’t a real-world demonstration.
It was basically a 1950s Disneyland ride. At the same event, Musk unveiled robots that were supposedly bartending and serving drinks. Except those robots turned out to be remote-controlled by humans. People exposed this, and eventually, Musk admitted it. The robots weren’t autonomous. They were fake.
The entire event was staged. The so-called robo-taxis were just cars driving around a few blocks with no real-world challenges. The robots were human-controlled. It was all smoke and mirrors.
Musk said on Tesla’s Q4 2024 earnings call, “There is no company in the world that is as good in real-world AI as Tesla” and asked, “Who’s in second place for real-world AI? I would need a very big telescope to see them. That’s how far behind they are.” Tesla’s claims are laughable next to Waymo, which conducts 150,000 rides per week in real cities with no drivers and no incidents. The difference is stark, yet Musk’s marketing machine convinces people otherwise.
Jacobsen: In light of the issues surrounding Tesla and Musk’s claims, this raises a larger question: to what degree are other CEOs of major corporations similarly inflating claims or outright spreading falsehoods about their products? How does Musk and Tesla’s approach fit into the broader multinational corporate image?
O’Dowd: This is far beyond anything I’ve ever seen. There is no functioning product. It simply does not work. Musk has been telling people for 10 years that it works, and he’s been selling it. He’s taken in billions of dollars from people buying this software—many of whom also bought the car because of the promise of Full Self-Driving. The software alone has generated billions, but it does not work. He’s been trying for years to make it work; meanwhile, the competition has completely passed him by.
In October 2016, Musk said, “All Tesla vehicles leaving the factory have all the hardware necessary for Level 5 autonomy.” Eight years later, during Tesla’s Q4 2024 earnings call, Musk admitted, “The honest answer is that we’re gonna have to upgrade people’s Hardware 3 computer for those that have bought Full Self-Driving.”
Companies like Waymo already have the very thing Musk claims he will deliver. It exists, it works, and it’s being used successfully. They’re selling it and making money from it. I’ve never seen anything like this in my life. There’s little difference between this and the Elizabeth Holmes case. Holmes claimed her device could run 100 blood tests from a single drop of blood. It couldn’t. Similarly, Tesla claims it has a fully self-driving car, but the car does not drive itself. How is that any different?
Of course, Theranos reached a $9 billion valuation, while Tesla’s valuation hit $1.4 trillion, largely based on FSD. That’s where the comparison diverges. No other company makes promises on this scale. Sure, automakers occasionally show concept cars with futuristic features that might be available in five years—or might not. But everyone understands that concept cars are aspirational. Musk, on the other hand, is delivering a product to consumers that doesn’t work, is unsafe, and is killing people.
Yet, he owns the public square. Remember, Musk owns one of the largest social media platforms. He has a direct link to 200 million people through his app, and he controls what is said there. Meanwhile, traditional news media outlets are in retreat—many have seen sales drop by 50%, and their subscriber bases are shrinking. Musk dominates the narrative, leveraging his platform and influence to shape public perception of Tesla and FSD.
Jacobsen: John Lyman suggested I ask you about the mounting scrutiny surrounding Elon Musk, particularly in light of Tesla’s ongoing challenges—safety concerns, declining sales, and the controversies surrounding the Cybertruck.
Compounding these issues, Musk’s increasing alignment with far-right ideologies—such as his endorsement of Germany’s Alternative für Deutschland (AfD), a party attempting to rehabilitate Hitler’s image—along with his erratic social media behavior and, most recently, a gesture that any reasonable observer would interpret as a Sieg Heil salute, have raised alarms.
Under normal circumstances, a CEO exhibiting this level of volatility would likely be forced out. Given Tesla’s situation, do you think the company could benefit from less polarizing leadership—someone who isn’t actively harming its brand? What are your thoughts on that assessment?
O’Dowd: He’s right about Tesla’s current situation. Their sales dropped last year, which is unusual because no other major car company I’m aware of experienced a decline—everyone else saw sales increase. Tesla’s market share also decreased. They only have two viable models, the Model 3 and the Model Y.
As for the Cybertruck, it’s a complete failure. They originally had 2 million reservations, but those didn’t translate into actual orders, and now they’ve run through the pre-reservation list. Only around 30,000 Cybertrucks—or even fewer—have shipped. The 2 million reservations were mostly fake orders, with only tens of thousands becoming real purchases.
Meanwhile, inventory is piling up because the demand is far smaller than they expected. The Cybertruck is not a smart product—it’s a bad product. This was their first major innovation since the Model Y, which came out years ago. And yet, it’s going nowhere.
Tesla also has significant reliability issues. Major organizations like J.D. Power and Consumer Reports consistently rank Tesla near the bottom, not the top, for reliability and safety. Many experts have recommended against using their Full Self-Driving feature because it’s unsafe. Recently, Tesla has been linked to more fatalities than any other car brand, which is alarming.
Politically, Musk’s position has also hurt Tesla. His base was originally people who cared about reducing CO2 emissions and transitioning to a non-fossil-fuel economy. Now, Musk has shifted to the far right. The people who believed in him—those who saw Tesla as a way to save the planet—are saying, “Wait a minute, I don’t agree with these things Musk is saying.” Owning a Tesla is no longer seen as a statement about environmentalism; instead, it’s becoming associated with far-right politics.
This shift has led to a cultural backlash. Some Tesla owners now put bumper stickers on their cars that say, “I bought this before Elon went crazy,” to distance themselves from him and insulate themselves from criticism while driving a Tesla.
This has hurt the Tesla brand significantly. It’s not just in the United States, either. Musk’s approval rating in the UK was recently reported as 71% negative. He’s jumped into British politics, trying to influence the government, and people are not reacting well. Imagine if BMW came to the U.S. and attempted to sway elections by backing Democrats or Republicans. That wouldn’t go over well, and it’s the same situation here.
At a high level, Musk sees himself as untouchable, almost like a modern-day emperor. He operates as though laws don’t apply to him and no one can hold him accountable.
The laws exist, but they don’t seem to touch him. He does things any other CEO would have been fired for in a minute. It’s wild, but he gets away with it.
Why? Because his fanboys, shareholders, and board of directors have all made immense amounts of money off a product that doesn’t work. He keeps saying it works, keeps spending money to promote it, and somehow manages to sustain the illusion. But it’s taking a toll.
The Wall Street Journal released a poll today showing his favorability at a net -11: 40% positive, 51% negative. But that poll was taken before the Nazi salute incident. How much further has that damaged his favorability? Significantly, I’d expect.
Jacobsen: Thank you for your time, Dan.