In under a decade, Tesla has become a household name and a dream car for a new generation of kids, nearly on a par with Lamborghinis and Bugattis. Teslas are quick, stylish, and technologically advanced, but clever, if often gimmicky, marketing has also played a large role in the company’s fame. Nearly all of these marketing ploys are harmless, like cheekily naming the cars the Model S, 3, X, and Y or sticking flashy “Falcon Wing” doors on an SUV, but in one instance Tesla went too far: Autopilot.

“Autopilot” implies, especially to ordinary people with little interest in the inner workings of technology, that the car can drive itself. But this is patently false: Teslas with Autopilot sit at “Level 2” on SAE’s autonomy scale, meaning Autopilot is simply an advanced driver-assistance system, not at “Level 4,” the level at which vehicles are considered fully autonomous.

Ever since Autopilot became available in 2015, some Tesla owners have been misusing the advanced driver-assistance system. When a driver engages Autopilot, which allows Teslas to steer, accelerate, and brake within their lanes without driver input, a message pops up on the dashboard telling drivers to “please keep your hands on the wheel” and to “be prepared to take over at any time.” While Tesla claims that drivers will receive warnings after 30 seconds of not holding the wheel, videos posted by Tesla owners have shown that the warnings can take far longer to appear.

That lag has allowed irresponsible Tesla owners to abuse the system: videos posted online show drivers taking naps, filming vlogs, and even riding in the passenger seat with the driver’s seat vacant while Autopilot is in control.

While most of these Tesla owners get away with their reckless actions, some instances have led to crashes, and several have been fatal. On December 29th, there were two deadly crashes. In California, a Model S that had just exited a freeway ran a red light and plowed into a Honda, killing both of the Honda’s occupants; in Indiana, a Tesla rear-ended a fire truck parked on the highway shoulder, killing the woman in the Tesla’s passenger seat. The National Highway Traffic Safety Administration has dispatched a team to investigate the California crash, and the agency has already analyzed 13 crashes in which it believes Autopilot was engaged.

Much of the blame for these deaths, and for other non-fatal crashes, lies with the drivers, who should have been using Autopilot safely by paying attention and keeping their hands on the wheel. However, Tesla must accept some responsibility for the crashes caused by inattentive Autopilot users. The branding matters: just as the “S3XY” model names stir customer interest, a misleading moniker confuses consumers. By contrast, there have been no reported crashes involving Cadillac’s less suggestively named “Super Cruise” system, which operates similarly to Autopilot.

This deceptive marketing extends to Tesla’s website, which touts the car’s ability “to assist you with the most burdensome parts of driving.” The Autopilot page devotes three paragraphs to “Full Self-Driving Capability,” explaining that “all new Tesla cars have the hardware needed in the future for full self-driving” before launching into a narration of how “you,” the reader and prospective Tesla owner, will be able to enjoy the fruits of full autonomy in your Tesla. Just one sentence on the whole site acknowledges Autopilot’s limitations, buried among reams of copy that make it seem as if the autonomous future has already arrived.

Tesla, of course, did not intend for its customers to misuse Autopilot or for these tragic crashes to occur, but that does not excuse its marketing tactics, especially when abuse of its technology can mean the difference between life and death for Tesla passengers and everyone else on the road.
