Tech Billionaire Dan O’Dowd Owns 5 Teslas. Now He’s Waging a War Against the Company.

According to O’Dowd’s research, self-driving Teslas don’t recognize common traffic signs like “one-way,” “do not enter” and “no turn on red.”

Dan O’Dowd said Tesla FSD is the worst piece of software he’s ever seen in a safety-critical product. Courtesy of Dan O'Dowd

On June 22, investor Ross Gerber drove his black Tesla (TSLA) Model S from his home in Malibu, Calif., to Santa Barbara to meet with tech entrepreneur Dan O’Dowd. The purpose of their meeting was to test out Tesla’s Full Self-Driving (FSD) software, an advanced driver assistance system installed in more than 400,000 Tesla cars in North America. For nearly two years, O’Dowd had been vehemently advocating for banning FSD on public roads, claiming the software is riddled with serious defects and poses dangers to public safety. Gerber, who owns more than $70 million worth of shares in Tesla, was there to prove O’Dowd wrong.


O’Dowd and Gerber got into Gerber’s car and took off on a roughly 30-minute route from O’Dowd’s home to his office in downtown Santa Barbara. With its FSD mode on and Gerber merely placing his hands on the steering wheel, the Model S stopped for 13 stop signs it encountered, slowed down for six construction workers it drove by, and yielded to eight street-crossing pedestrians, according to a live stream of the test drive. It sure looked like the car would have been capable of driving itself even without a human driver behind the wheel.

However, at one intersection, the self-driving Tesla failed to brake for a stop sign and narrowly missed hitting an SUV. (Gerber took over at the last second and stopped the car.)

In a later test drive that day without Gerber, O’Dowd took a Tesla Model 3 on a 15-minute drive through a nearby school district. The test video showed the FSD-engaged car repeatedly cruising past a school bus with its stop sign extended and lights flashing, without slowing down.

“It drives like a drunk,” O’Dowd said of a self-driving Tesla in an interview with Observer. “It can’t drive a few miles in a city without getting pulled over.”

A costly campaign to stop self-driving Teslas

O’Dowd, the founder and CEO of Green Hills Software, a tech supplier to major auto and aircraft makers, is on a lonely, and expensive, quest to prove Tesla FSD is a road killer.

At his company headquarters, O’Dowd has converted an office space into a media room and employs about a dozen workers focused solely on the mission, officially known as the Dawn Project. His staff consists of engineers, video editors, public relations specialists and other professionals.

In the past two years, O’Dowd’s team has conducted a number of test drives that apparently showed FSD doesn’t recognize common traffic signs like “one-way,” “do not enter” and “no turn on red.” It also doesn’t seem to understand speed limit signs and toll booth gates, those tests suggested. The only traffic sign FSD seems to understand, O’Dowd said, is a stop sign.

“This product is so far from being finished in development. There are so many terrible holes in it,” O’Dowd told Observer. “It’s inconceivable a large corporation would do that.”

In February, O’Dowd purchased a 30-second Super Bowl commercial showing an FSD-engaged Tesla making various errors, including entering a closed street, swerving into oncoming traffic, driving on the wrong side of the road, ignoring a school bus with a flashing stop sign, and running over a mannequin meant to represent a child crossing the street. In November 2022, O’Dowd placed a full-page ad in the New York Times warning against the dangers posed by FSD. Earlier that year, he ran for a California U.S. Senate seat on the very issue of banning FSD but didn’t get elected.

O’Dowd’s anti-FSD campaign has prompted strong responses from Tesla and its loyal legion of fans. In August 2022, Tesla sent a cease-and-desist letter to the Dawn Project, demanding the group remove a video from its website showing a Tesla car knocking over child-sized mannequins. Some Tesla supporters claim O’Dowd’s tests were staged and inauthentic. Others question his motive, pointing out that Green Hills sells software to Tesla’s competitors, including Mobileye, a company making chips for autonomous driving systems.

O’Dowd asserted the Dawn Project’s test drives were conducted by qualified professionals and dismissed as “silly arguments” the claims that he is going after Tesla to benefit his own business, noting that Green Hills already works with almost every major automaker in the market. He said Green Hills does no business with Tesla and that he has never held a financial position in the EV maker.

O’Dowd said he’s never spoken directly to Musk. Their last interaction happened in January 2022 on Twitter, when Musk called O’Dowd’s software company “a pile of trash” in a post.

A loyal Tesla customer turned the company’s No. 1 critic

It would be inaccurate to characterize O’Dowd as a Tesla hater. In fact, until he recently took issue with FSD, he had been a loyal customer of the Elon Musk-led company. O’Dowd owns four Teslas—two Roadsters and two Model 3s—and no other vehicles. His wife has been driving the same Model S since 2012.

“They are the best in the world,” he said of his Roadsters, Tesla’s first production model, with a proud smile.

However, with an engineering background in building highly secure software for corporate and government clients, O’Dowd finds the flaws in Tesla’s autonomous driving product, and the “move fast and break things” Silicon Valley mentality behind it, intolerable.

“They are using Silicon Valley methodologies to develop self-driving software and shove it out before it’s ready,” O’Dowd said. “I’m not saying that’s wrong for them, because that’s how you make a trillion dollars. But that’s not acceptable for safety-critical situations like driving a car on public roads.”

O’Dowd’s Green Hills Software builds operating systems used in military aircraft like F-16s and B-52 nuclear bombers. Its clients also include NASA, Boeing and major carmakers. O’Dowd cofounded the company in 1982 after graduating from the California Institute of Technology and owns most of it, according to the Washington Post. Green Hills was valued at nearly $1 billion in 2019.

The real issue with FSD that no one’s talking about

Tesla FSD is classified as a level 2 autonomous driving system under the Society of Automotive Engineers (SAE) standards adopted by the U.S. Department of Transportation. The SAE defines six levels of driving automation, ranging from level 0 (fully manual) to level 5 (fully autonomous). A level 2 designation means a human driver needs to be ready to take over at any time while the vehicle is moving.

In its user manual, Tesla does state that a driver needs to stay alert at all times when FSD is engaged. But CEO Musk has suggested on multiple occasions that the software is safer than a human driver and is rapidly improving, which may have encouraged many Tesla drivers to lower their guard when using FSD.

In September 2022, the California Department of Motor Vehicles accused Tesla of deceptive marketing by branding a level 2 driver assistance software “full self-driving.” But the real issue, O’Dowd said, is not Tesla naming a product something it’s not, but the fact that FSD acts like a level 5 fully autonomous system when it can’t really drive a car.

Tesla currently sells FSD as an optional add-on for $15,000. The company also offers a less advanced driver assistance system called Autopilot, which includes more basic features like adaptive cruise control and lane centering. Autopilot and FSD are both level 2 systems, but they behave quite differently: a typical level 2 system will alert the driver when something is wrong, whereas FSD tends to make decisions for the driver.

“When FSD is engaged, all other driver assistance functions are turned off and the car doesn’t tell the driver what’s going on anymore,” O’Dowd said. “The system doesn’t require human intervention to start driving.”

Level 2 driver assistance systems are installed in millions of cars in the U.S., and there is little regulation on their use. O’Dowd believes that, since FSD clearly acts like a higher-level system, it should be subject to more restrictions on public roads.

In a televised debate with O’Dowd on CNBC on June 29, Gerber argued FSD shouldn’t be limited because A.I.-powered driver assistance systems like it need to learn from driving on real roads in order to improve.

Officially, Tesla FSD is still in the beta stage, meaning it is still being tested and refined. The National Highway Traffic Safety Administration (NHTSA) has an active investigation into the safety of Autopilot and FSD. The agency’s latest data release in June showed Tesla Autopilot has been involved in 736 crashes and 17 fatalities since 2019—far more than previously reported. An increase in Autopilot-related injuries and deaths coincided with Tesla’s expansion of FSD to more cars, although the NHTSA didn’t release FSD-specific data.

O’Dowd estimates he has spent more than $10 million on the Dawn Project in less than two years and has no plans to stop. His objective is to either push Musk to fix FSD’s many problems or urge the NHTSA to ban the software.

“They are not minor problems. They are severe problems. If Tesla is unable to fix them, it’s time to give up. If they can fix them but are not willing to do so, then we have a serious safety cultural problem,” he said.
