Tesla’s Claim That Its Cars Are Self-Driving May Cross the Line From Permitted ‘Puffery’ to False Advertising

Tesla is selling a semi-autonomous driver assistance system under the name "Full Self-Driving." Is it fake-it-till-you-make-it or outright false advertising?

Dan Kiely, CEO and co-founder of Voxpro, takes his hands off the wheel of his Tesla Model S P100D at a launch event for the MobilityX self-driving conference on Tuesday, May 8, 2018, in Dublin, Ireland. (Photo by Artur Widak/NurPhoto via Getty Images)

No Tesla (TSLA) car for sale today is capable of driving itself. And to be fair, neither are any consumer cars made by its competitors. But the words “self-driving” and “automatic” have long been embedded in the branding of Tesla’s driver assistance products, such as its Autopilot and Full Self-Driving software. Critics have questioned the integrity of Tesla’s marketing practices, and legal experts say the company’s claims may have crossed the line into false advertising.

“Sometimes I feel I’m a victim of Elon Musk,” said Johannes Himmelreich, a professor at Syracuse University who studies the ethics of emerging technologies. Himmelreich joked that he has been holding off on buying a new car until Tesla delivers a self-driving vehicle, but that day never comes. “Elon always says full self-driving is just around the corner. According to him, it should have happened four years ago.”

The controversy recently escalated when the California Department of Motor Vehicles accused Tesla of deceptive marketing and threatened to revoke its sales license in California if the company fails to comply with the regulator’s requirements.

What is FSD and why does Tesla call it something that it’s not?

Currently, all new Tesla cars come with a free driver assistance program called Autopilot, which includes features like adaptive cruise control and lane centering. Customers have the option to upgrade, for $15,000 up front or $199 per month, to a more advanced package called Full Self-Driving (FSD). FSD can perform complex tasks, such as automatic lane-changing, traffic light recognition on a pre-programmed route and “smart summon,” a feature that allows a driver to summon their car from a parking spot to come pick them up.

Both Autopilot and FSD are classified as level 2 autonomous driving systems under the standards of the Society of Automotive Engineers (SAE). The SAE defines six levels of driving automation, ranging from level 0 (fully manual) to level 5 (fully autonomous). These standards are adopted by the U.S. Department of Transportation. A level 2 designation means a human driver needs to stay alert at all times while the vehicle is in motion. Competing systems from other companies, including General Motors’ Super Cruise and Ford’s Co-Pilot360 ADAS, are also level 2 autonomous driving systems.

Tesla doesn’t have an advertising or marketing department, according to The Org, a database of organizational charts of tech companies. It’s not unreasonable to speculate that branding decisions, such as naming a driving software, ultimately come from the top—its CEO—and are not open to differing opinions. On online discussion boards, current and former Tesla employees have described a “fearmongering” company culture and “a heavy command-control vibe from senior leadership” in the office, according to private posts on Team Blind, a networking site for verified tech workers, obtained by the Observer. Tesla, which also doesn’t have a media relations department, could not be reached for comment.

Tesla appears aware of the limitations of its software and how its branding might mislead consumers. Tesla released the first version of FSD, called “FSD Beta,” in October 2020 to a small group of customers. Five months later, when the company expanded access to more drivers, Musk cautioned in a tweet, “The word ‘Beta’ is used to reduce complacency in usage and set expectations appropriately.” In small print on its website, Tesla states that a human driver must pay full attention while using Autopilot or FSD capabilities.

But it seems Tesla has adopted a fake-it-till-you-make-it approach. Musk has on multiple occasions said Tesla is close to achieving level 5 automation. At an artificial intelligence conference in July 2020, for example, he told an audience Tesla had the hardware to enable level 5 autonomy and that it “will happen very quickly.”

“It’s possible that Musk truly believes it, because in many ways he lives in the future and has lost touch with reality—economically anyway,” said Himmelreich. “But the worst part is that it ruins the meaning of these words for the rest of us. It makes it much harder to trust other self-driving car companies.”

In fact, higher level autonomous driving systems do exist. Waymo, a spinoff of Google’s self-driving car unit, operates a fleet of robotaxis in Phoenix, Arizona that are rated as level 4 autonomous driving, meaning a vehicle can drive itself under limited geographic conditions and doesn’t need human supervision. Waymo CEO John Krafcik has scoffed at Musk’s optimism that FSD could one day live up to its name. “It is a misconception that you can simply develop a driver-assistance system further until one day you can magically jump to a fully autonomous driving system,” he told a German magazine in a January 2021 interview. (Musk responded in a tweet that Tesla has more money and better technology than Waymo.)

Is it false advertising, then?

Businesses make exaggerated claims about their products all the time. It’s legal as long as the claims cannot be objectively verified, said William Scott Goldman, an advertising law attorney in Washington, D.C. In fact, there is a legal term for this practice: “puffery.” An example of puffery would be promoting something as “the best in the world” or “No. 1 in the country.” But knowingly labeling software as “autopilot” or “full self-driving” when it’s actually a level 2 driver assistance program may have crossed the line into false advertising, Goldman said.

In 2020, a court in Munich, Germany ruled that Tesla had violated the European Union’s advertising law with the marketing of its Autopilot software. (FSD wasn’t available in Europe at the time.) However, Tesla appealed the ruling and successfully had the verdict reversed in a higher German court in October 2021 on the condition that Tesla change some of the language on its website to clarify Autopilot’s capabilities.

Tesla hasn’t faced similar legal battles in the U.S., despite having sold Autopilot for six years and FSD for two. “It’s a little bit surprising that no consumer advocacy groups or individual consumers have brought a false advertising lawsuit against Tesla,” said Goldman.

That could change soon with the California DMV’s recent complaint, in which the regulator argues the “Autopilot” and “Full Self-Driving” labels could mislead consumers into believing that Tesla cars can drive themselves, according to a filing to California’s Office of Administrative Hearings, a state administrative body, on July 28.

“Usually when a government entity raises an issue like this, it will lead to a slew of consumer class actions. That’s probably a bigger concern for Tesla,” said Jason Stiehl, an attorney specializing in consumer class actions and advertising disputes at Crowell & Moring, a Washington, D.C.-based law firm.

In a letter to the California DMV on Aug. 12, Tesla requested a hearing to present its defense against these accusations. The DMV is currently in the discovery stage of the case and could negotiate a settlement with Tesla before a hearing, whose date has yet to be determined, said a spokesperson for the agency.

If the DMV prevails, Tesla could lose its sales license in California. In the complaint, the regulator demanded that Tesla advertise honestly to consumers and better educate its drivers about the actual capabilities of Autopilot and FSD.