Dan O’Dowd says Tesla’s ‘Full Self Driving’ software shouldn’t be on the road. He’ll keep running over test dummies until someone listens.
The third Tesla is crucial for an unusual hobby: O’Dowd is waging a multimillion-dollar campaign to get Tesla’s Full Self-Driving software off the roads — before Tesla CEO Elon Musk follows through with plans to make the tech available worldwide by the end of the year.
O’Dowd, who made his fortune selling software to military customers, has been using the Model 3 to test and film the self-driving software. He’s documented what appear to be examples of the car swerving across the centerline toward oncoming traffic, failing to slow down in a school zone and missing stop signs. This summer, he triggered an uproar by releasing a video showing his Tesla — allegedly in Full Self-Driving mode — mowing down child-size mannequins.
“If Tesla gets away with this and ships this product and I can’t convince the public that a self-driving car that drives like a drunken, suicidal 13-year-old shouldn’t be on the road, I’m going to fail,” O’Dowd said in an interview from his Santa Barbara office, where glass cases display his collection of ancient coins and auction-bought mementos from NASA moon missions.
O’Dowd has run nationwide TV ads with the videos and even launched an unsuccessful campaign for the U.S. Senate as part of his one-man crusade to challenge what he sees as the cavalier development of dangerous technology. For O’Dowd and other skeptics, the program is a deadly experiment foisted on an unsuspecting public — a view underscored by a recently filed class-action lawsuit and a reported Department of Justice investigation into the tech.
Despite O’Dowd’s high-profile campaign, and the concern from some regulators and politicians, Tesla is charging ahead with what it claims is world-changing technology. The company and its supporters argue their approach will help usher in a future in which death from human errors on roadways is eliminated. At the end of September, during a four-hour event in which Tesla showed off its latest artificial intelligence tech, Musk said Full Self-Driving is already saving lives and keeping it off public roads would be “morally wrong.”
“At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it even though you’re going to get sued and blamed by a lot of people,” Musk said. Musk and Tesla, which does not typically answer media inquiries, did not respond to requests for comment.
Readying Full Self-Driving for worldwide deployment adds to Musk’s heavy workload. On top of running Tesla and his other companies SpaceX and Neuralink, he now owns and runs Twitter. Musk has also roped in Tesla engineers to help work on Twitter.
O’Dowd’s quest has prompted personal attacks and criticism from Musk’s legions of supporters. Tesla sent O’Dowd a cease-and-desist letter in response to his child-mannequin test. And this week, O’Dowd said, Twitter rejected a new ad he tried to run.
But Tesla fans aren’t his only critics. Safety experts have questioned whether rogue independent testing is the right way to spur tougher regulation.
“They’re doing it on public roads. It still raises the same ethical issues that you’re putting other people at nonconsensual risk,” said Phil Koopman, an associate professor at Carnegie Mellon University who has studied autonomous car safety for years. “I’m not a fan no matter who it is.”
“Consumers should never attempt to create their own test scenarios,” said National Highway Traffic Safety Administration spokeswoman Lucia Sanchez.
O’Dowd’s view that software should be developed methodically, ensuring it’s fully secure before releasing it, is a stark contrast to the “move fast and break things” mantra that allowed companies like Google, Facebook and Amazon to become the behemoths they are today. Tesla has taken that mantra and applied it to public roads, he believes. O’Dowd’s critics allege his campaign against Tesla is self-serving: One of his customers — Intel-owned Mobileye — makes a computer chip that runs driver-assist and self-driving software. O’Dowd says Mobileye is just one of hundreds of customers and his motivation is driven purely by his concerns about the safety of Tesla’s tech.
Tesla cars have had a range of driver-assist features for years, like lane-keeping and automatic braking, under the name Autopilot. Millions of vehicles in the U.S. use advanced driver-assist features, and more than 60 percent of the cars sold in 2021 had lane-keeping, according to research firm Canalys. Data released by NHTSA in June showed that Tesla cars were involved in almost 70 percent of the 392 crashes involving advanced driver-assist features reported over the previous 11 months. NHTSA also recorded six deaths dating back to 2019 involving the features, five of which were tied to Teslas. The data set doesn’t account for how different automakers collect data, making direct comparisons with other manufacturers difficult.
But the company’s Full Self-Driving Beta program, which is currently available to about 160,000 drivers in the U.S. and Canada, goes further than anything else used by regular drivers on public roads, giving the car the ability to navigate through city and residential streets, stop for red lights and make turns on its own while following a mapped route.
“Everybody thinks I’m exaggerating, but I have literally never seen a worse program in my life,” O’Dowd said.
As a teenager in suburban Michigan, O’Dowd found a passion for coding, learning how to program on an IBM computer.
The year after graduating from the California Institute of Technology, he designed his first “debugger,” a computer program that can run through another set of code and find its flaws. The process of methodically fixing errors fascinated him, and set off a lifelong mission against what he sees as an epidemic of terrible, bug-ridden software infiltrating the world.
In 1982 he founded Green Hills Software in Pasadena, Calif., to make operating systems for the tiny computers that were beginning to pop up inside industrial machines, planes, ships and trains. His super-secure approach won over the U.S. military, and his first big sale was to the maker of the B-1B supersonic bomber.
O’Dowd is no stranger to public fights. Two decades ago, he wrote several blog posts saying that Linux, the free, available-to-all operating system that was beginning to threaten his business, was dangerous because foreign spies could insert malicious code into it. His warnings mostly fell on deaf ears. Today, Linux is everywhere, and Green Hills’ own software is capable of interacting with Linux programs. But O’Dowd’s position hasn’t changed. Linux may be fine for tools like thermostats, but when it comes to planes, power plants and cars, he insists it’s still dangerous.
O’Dowd owns most of Green Hills, which was worth just shy of a billion dollars in 2019 when he sold a $150 million chunk of it. Beyond several houses in tony Santa Barbara, he doesn’t want to spend his money on the yachts and private islands other wealthy tech founders seem to delight in.
In the past, O’Dowd bought historical artifacts — he owns the world’s most expensive coin collection, and has both a Nazi-era Enigma encoding machine and a nearly intact fossilized Tyrannosaurus rex skull displayed in his offices.
As he got older, O’Dowd said he began looking for a more lasting project to spend his money on. About a year ago, a colleague told him about Tesla Full Self-Driving videos online. O’Dowd spent hours watching them, and said he was shocked by what he saw.
“Why would you put a car on the road that does something that would be illegal, that would cost you your license?” he said. He decided to make Tesla’s software the centerpiece of a bigger campaign he has named the Dawn Project to expose what he sees as incomplete and unsafe software being used in cars and power plants and in cybersecurity.
In early 2016, a year after Tesla first launched Autopilot, Musk said the tech was already “probably better” than a human driver and predicted that in two years his cars would be able to drive themselves across the country.
That hasn’t happened yet, but the company has steadily upgraded Autopilot over the years, allowing drivers to give up more control over their vehicles, though Tesla insists they must stay alert at all times. Full Self-Driving Beta represents the latest version of Tesla’s tech, and is available to Tesla drivers who’ve already paid $15,000 for the regular version of Full Self-Driving and have a good “safety score” as measured by Tesla’s in-car software.
In an interview with a Tesla owners group posted in June, Musk said successfully building self-driving software is “the difference between Tesla being worth a lot of money and being worth basically zero.”
Autopilot technology has spurred several government investigations, including one from NHTSA that is looking at whether the tech played a role in Teslas crashing into parked emergency vehicles.
Overall though, Tesla has benefited from a U.S. regulatory vacuum, where there are no rules against putting new driver assistance software out on public roads as long as the automaker specifies that the driver stay alert at all times.
Auto safety rules were written well before the advent of cars that can make their own decisions, said Koopman, the auto regulations expert. The government moves at a “glacial” pace, and the rules have in some ways become more permissive, Koopman said. In 2022, NHTSA said cars no longer need a steering wheel and pedals to meet safety standards.
“The regulatory pressure has been to make it easier, not harder,” he said.
NHTSA says that no vehicle available for purchase is self-driving, and drivers are always required to stay attentive, regardless of what technology their cars have.
O’Dowd’s strategy is straightforward: Find proof that Tesla’s software makes serious mistakes and get that evidence in front of the public and regulators.
He kicked off his campaign in January with a full-page ad in the New York Times, which alleged that the software makes a mistake every 8 minutes, based on analyzing dozens of hours of YouTube footage. The ad demanded that regulators ban Full Self-Driving from American roads.
Then came his Senate campaign, something he said was partly motivated by the fact that campaign advertising laws would make it easier for him to get his message out if he was technically running for public office. He’s spent millions of dollars on ads that play on TV channels across the country.
The ads pull from videos of Tesla drivers having to intervene when their cars running Full Self-Driving Beta make mistakes. Another set of ads compares Musk’s predictions for when his cars would be able to drive themselves to the reality that they still need close driver supervision. (One of O’Dowd’s ads quotes text from a Post story. The Post was not involved in the ad’s production.)
Next, O’Dowd began shooting his own videos. He hired a driver for his new Model 3 to do tests on public roads and closed courses. At one point, O’Dowd was in the passenger seat when the vehicle began crossing the centerline just as another car was coming in the other direction. The test-driver grabbed the wheel and averted a head-on collision, according to O’Dowd’s recollections and dash-cam video reviewed by The Post.
“It almost killed me,” he said. “It’s personal now.”
In addition to running ads and posting videos online, he sends the footage to NHTSA, imploring the agency to take action. Beyond confirmations of receipt, he hasn’t heard back from the agency.
NHTSA looks at all relevant information in its investigations, said Sanchez, the agency spokeswoman.
O’Dowd has set up an entire media operation, converting a large room in his company’s office building into a TV studio, complete with green screen and high-speed upload link. He’s hired public-relations professionals and video editors.
The campaign has gotten attention from Musk’s many followers. Online, some call O’Dowd by the clunky nickname “O’Clown.” Musk himself has referred to O’Dowd on Twitter with the emoji for bat and poop — suggesting he’s crazy — although there doesn’t seem to be much direct interaction between the two men. O’Dowd says they’ve never spoken.
In August, O’Dowd released the video of his Tesla repeatedly hitting the child-size mannequins. He set up the test at a closed course, then had his driver go down a lane of orange safety cones and engage Full Self-Driving Beta. When the video came out, skeptics accused him of rigging the test by not turning on Full Self-Driving or by overriding it by having the driver press down the accelerator pedal. O’Dowd says the test was legitimate and has begun running more, with video cameras filming from more angles to back it up.
Hours after O’Dowd released video of his test, one of the best-known Tesla fans put out a call on Twitter for a real child so that he could run the test himself. In the test by Omar Qazi, a 28-year-old software engineer who goes by the name Whole Mars Catalog and frequently clashes with O’Dowd and other Tesla critics, the car spotted the child and wouldn’t move forward.
Qazi views O’Dowd as just another anti-Tesla character, standing in the way of progress but doomed to fail.
“This is something that can’t be stopped by Dan or anyone else,” said Qazi, who owns Tesla shares but says they make up a small part of his overall investments. “I think putting something out now, even if it’s imperfect, it really has a lot of benefits.”
O’Dowd said he wants to do the tests again, this time with media, regulators, Tesla supporters and even Musk himself in attendance, so he can erase whatever lingering doubts there may be about his methods.
Qazi, for his part, isn’t worried O’Dowd will get the program banned any time soon, despite his incessant lobbying.
“Dan O’Dowd has been talking their ears off, everyone who will listen, and they have not banned it,” he said.
Faiz Siddiqui contributed to this report.