The viral Tesla FSD video shows why human drivers are a big problem


In 2024, Tesla finally admitted what most people who used the technology already knew: Full Self-Driving was not the Level 4 autonomous driving experience it had been promising for years.

The company was forced to add “supervised” to FSD’s official name.

SAE International (formerly the Society of Automotive Engineers) considers advanced driver assistance systems, such as GM’s Super Cruise and Tesla’s Full Self-Driving, to be Level 2 automation, which requires the driver to remain engaged.

Anything at Level 3 or above is considered truly autonomous. This means the system handles the driving task without human intervention while it is engaged, going beyond driver-assistance features such as lane assist and automatic braking.

However, at Level 3 the driver must still be ready to take over when the system requests it. J.D. Power lists Mercedes-Benz Drive Pilot as a Level 3 system.

This is not what Tesla FSD (Supervised) is designed to do. While the term Full Self-Driving (Supervised) is an oxymoron, the supervision component is extremely important, as it requires the driver to pay attention to the road even while the software is driving.

In July 2022, the California Department of Motor Vehicles accused Tesla of making misleading statements about FSD and Autopilot, and afterward threatened to revoke Tesla’s vehicle dealer and manufacturing licenses.

In February, Tesla sued the California DMV to reverse a ruling that the company had engaged in false advertising by using the terms “Autopilot” and “Full Self-Driving” to describe its technology.

A viral video out of California over the weekend showed just how dangerous the misconception that FSD is fully self-driving can actually be for people on the road.

Tesla CEO Elon Musk has claimed that drivers using fully self-driving technology will be able to sleep while their vehicles drive them safely to their destinations. Photo by Alexander Shapovalov at Getty Images

Just because Tesla is fighting California over its claims about what the company’s assisted-driving tech can actually do doesn’t mean CEO Elon Musk will stop building cars with the feature.

A March 3 post on X (formerly Twitter) shared a clip of an interview in which Musk claimed that Tesla drivers would be able to sleep while their vehicles got them to their destinations safely.

Some X users have voiced how dangerous Musk’s exaggeration can be for current drivers, and a recent viral video shows the real-world consequences of his bluster.

Related: Tesla Loses Critical Autopilot Ruling That Could Cost Hundreds of Millions

A video from the 10 Freeway in Colton, California, appears to show a Tesla driver asleep at the wheel while the car travels down the freeway.

The person who recorded the video at 3:30 p.m. on Sunday, March 1, told ABC7 they immediately called police about the driver, but the California Highway Patrol said they were unable to locate them.

Last August, a Florida jury ruled that the family of Naybel Benavides and crash survivor Dillon Angulo were entitled to roughly $243 million in damages after driver George McGee crashed his Tesla into the car they were standing beside.

McGee testified that Autopilot was engaged when he killed Benavides, 22, in Key Largo in 2019, but that he had taken his eyes off the road to look for a cell phone he had dropped.

Related: The look of the new Tesla almost seems too good to be true

“Tesla in the showroom is telling you that they have invented the greatest fully self-driving car the world has ever seen,” Brett Schreiber, the plaintiffs’ trial attorney, said at the time.

“Mr. Musk has been pitching to consumers and investors for a decade that cars are fully self-driving and that the hardware is fully autonomous. And those statements were as false then as they are today.”

U.S. District Judge Beth Bloom upheld the jury’s initial verdict on Feb. 20, saying the evidence at trial “more than supported” the verdict and that Tesla had not presented any new arguments to warrant a review.

On December 16, 2025, Juliet E. Cox, an administrative law judge for the California Office of Administrative Hearings, ruled in favor of the California Department of Motor Vehicles, which had brought a complaint against the company, finding that Tesla had been deceptive in its marketing of Autopilot and Full Self-Driving.

The judge ordered that Tesla face a 30-day suspension of its sales and manufacturing licenses in the state.

However, California DMV Director Steve Gordon said at the time that his agency had adopted the judge’s order with a revised penalty.

Related: Tesla Proves It’s Really a Tech (Not Car) Company With the Latest Move

This story was originally published by TheStreet on March 4, 2026, in its Automotive section.
