Morons Leave Driver Seat Empty, Let Tesla Autopilot Drive So They Can Keep Drinking In the Car
...and of course they posted the evidence on TikTok.
I hate to break it to the lowest common denominator of TikTok, but drunk-driving and open-container laws still apply even when Autopilot takes the wheel with no one in the driver's seat. Thanks for posting the evidence on the internet, though!
That's bad enough, but they one-upped their own dumb behavior by leaving the driver's seat empty as the car went 65 mph down the highway. Yikes.
We can't embed it on our site, but you can view the full video on TMZ here, which has the bonus of not giving the original evidence video the attention-clicks. That's what this is to The Man, you know: evidence.
Other TikTokers thankfully know better and are giving this the extreme side-eye it clearly deserves thanks to the app's split-screen "duet" feature. A personal injury lawyer, Attorney Tom, chimed in to say that if you get in an accident while using Autopilot, you will be sued anyway. (Tom's stare pretty much said it all.) Some joked that the dudes roped in a recently deceased celebrity like Kobe Bryant or Juice Wrld to drive for them. Others just posted the obvious: a photo of a cop staring at the screen, and a photo of an article about a guy who got arrested for drunk driving after falling asleep with Autopilot on.
"Natural selection," commented Billy Mueller on another TikTok roasting the drunk-ghost-ride-the-Tesla vid. "This is how God makes us smarter as a species. People drink White Claw and let Elon take the wheel."
Others on TikTok simply wondered who would get the ticket if they got pulled over, but the answer is clearly "everybody," thanks to laws in the U.S. against open containers of alcohol in the car.
This isn't the first time blurrblake has posted reckless behavior with the Tesla on his TikTok account, which now appears to have changed its name to blurr.tv. He has another video up showing a teddy bear behind the wheel with a dude reclining in the front passenger seat. "#viral!"
Someone's mom needs to take Blake's toys away for a while.
You shouldn't need other TikTokers to say this, because Tesla has said it itself: despite a name that seems to imply otherwise, Autopilot is a driver-assist system, and it requires a person behind the wheel while it's in use who can take over if it malfunctions or makes a mistake.
That warning doesn't just apply to Autopilot, either. There are no truly self-driving systems on the road today. They all require you to pay attention, and you'll be in serious legal trouble if you're caught without a hand on the wheel, let alone with an empty driver's seat and a bunch of open hard seltzers. (Sheesh, both their judgment AND their taste in booze are terrible.)
Unfortunately for the responsible drivers out there, this kind of reckless behavior puts everyone on the road in danger, too. There have already been plenty of crashes where drivers put too much faith in Autopilot's driving skills. Please, just call someone sober (as in, an actual, live person who isn't Kobe, Jesus, or Elon's army of programmers) to drive you home.
Got a tip? Send us a note: firstname.lastname@example.org
- Related: Tesla Driver Watching Movie on Autopilot Crashes Into Cop Cars: Police. "Dear Tesla owners: Don't be this guy."
- Related: Autopilot Blamed for Tesla's Crash Into Overturned Truck. "Keep your hands on the wheel and your eyes on the road."
- Related: Another Tesla Crashes Into a Cop Car While on Autopilot. "It seems 'capable of driving itself' applies to neither the car nor its driver."
- Related: Tesla Model 3 Driver Says Autopilot Was Engaged During Crash Into Highway Patrol Car. "The EV driver was reportedly checking on his dog in the backseat when he struck the police cruiser."
- Related: NTSB Blames Smartphone Use, Tesla's 'Completely Inadequate' Tech in 2018 Model X Autopilot Crash. "The NTSB called on Tesla to improve its driver monitoring years ago. It never responded, and a driver was killed."