Is Tesla Autopilot Safe?

Written by: Anonymous

If you had asked me this a year ago, I probably would have said, “I know nothing about cars–much less driving safety.” But recently, the carpooling schedule I’ve been thrown into has made me painfully aware of the dangers of using AI and computers to drive cars–especially their tendency to encourage poor driving practices.


So how did I even get myself into this position? My average school-day morning is built around a set routine: wake up, brush my teeth, eat breakfast, pack my backpack, contemplate the number of tests waiting to destroy me, and hurry out the door to our neighbor’s garage. Our carpooling system works so that my mother picks us up after school while our neighbor’s parents drop us off in the mornings. I get into a polished Tesla that reeks of sweat, and we embark on a long, mostly-Autopilot journey to CCA. That’s an exaggeration, but my neighbor’s driving sure is jerky and inconsistent, and he often speeds–thrusting us around the car.


While I occasionally look out the front to check whether we’re even in a lane, I ask myself the same question every day: Am I safe? Probably not–in fact, the Tesla’s screen glares red several times each day, telling him to “keep your hands on the steering wheel” in order to get him to pay attention.


First and foremost, how does it even know he’s not paying attention? Tesla’s Autopilot system is composed of various cameras and sensors arranged around the outside and inside of the car. These sensors detect movement in front of, behind, and beside the vehicle; they map out the fastest route (for example, when to switch lanes to increase efficiency); and they can detect whether or not the driver’s hands are on the steering wheel.

[Image: Screen inside a Tesla displaying surrounding cars and pedestrians, based on data collected by sensors on the outside of the car.]

This all sounds pretty neat, right? Theoretically, yes: it should just be an assistant that makes driving safer for already attentive drivers. On Tesla’s website, for example, the company states that “Autopilot and Full Self-Driving (Supervised) features require active driver supervision and do not make the vehicle autonomous.” That would make sense, and if anything, it should be a wonderful tool to help young and old drivers stay even safer on the road.

But, on the other side of the conversation: does that encourage people to text while driving just because they feel safer letting go of the steering wheel? What’s to say that having Autopilot won’t encourage someone who’s intoxicated to drive? These AI features and sensors can easily be misused by drivers. Tesla Autopilot serves as an invitation for people, like my carpool driver, to multitask while driving. Autopilot has also statistically been linked to crashes and injuries. According to an NBC News article, “Tesla’s Autopilot system contributed to at least 467 collisions, 13 resulting in fatalities and ‘many others’ resulting in serious injuries.” Tesla’s system clearly does not hold drivers’ attention well enough for them to be considered as exercising “active” supervision.

Furthermore, this system relies solely on honesty. The only requirement for drivers to use Autopilot is “to agree to ‘keep your hands on the steering wheel at all times’ and to always ‘maintain control and responsibility for your vehicle’” (Tesla.com). The system is not at all designed to be autonomous, as Tesla itself states, but the company is certainly creating a situation where drivers treat it as if it were–and it’s starting to become a safety issue.

Finally, Tesla drivers are not the only ones in danger. On April 19, 2024, a “Tesla driver in Snohomish County, Washington, struck and killed a motorcyclist … The driver told police he was using Autopilot at the time of the collision” (NBC News). Other people on the road are at risk of becoming victims of the Autopilot system’s failures.

Even if it’s not meant to encourage texting, eating, or other distractions in the car, Tesla’s Autopilot is raising serious safety concerns. Is high-end technology really enough to counteract unsafe driving? Or is it making these habits more common among Autopilot users? Personally, the answer is clear for me when I sit in the back of that sweaty Tesla–closing my eyes every time the car releases an angry beeping sound that blends with the YouTube videos playing on the monitor.
Citations:

“Tesla Autopilot Linked to Hundreds of Collisions, Has ‘critical Safety Gap,’ Federal Regulator Says.” NBCNews.Com, NBCUniversal News Group, 26 Apr. 2024, www.nbcnews.com/tech/tech-news/feds-say-tesla-autopilot-linked-hundreds-collisions-critical-safety-ga-rcna149512.

“Autopilot and Full Self-Driving (Supervised).” Tesla Support, Tesla, www.tesla.com/support/autopilot.