Tesla is blaming last month's fatal Model X crash largely on the driver, not the car itself, according to a statement the company released this week.
On March 23, 38-year-old Walter Huang, an Apple engineer, was driving to work with the car on Autopilot, according to a post on Tesla's blog. But then tragedy struck: At 9:27 a.m. local time, the car slammed into an unshielded highway median on U.S. Highway 101 near Mountain View, California, killing Huang and causing an inferno that closed the major highway for hours.
Although Tesla offered its condolences to Huang's family, it didn't mince its words when discussing the crash's cause. The only way the accident could have happened is if Huang "was not paying attention to the road, despite the car providing multiple warnings to do so," according to a statement Tesla sent April 10 to Dan Noyes, an investigative reporter with California's ABC7 News. [Self-Driving Cars: 5 Problems that Need Solutions]
However, a lawyer for Huang's family said that by blaming the driver, Tesla is sidestepping concerns Huang's family has about the car's Autopilot capabilities, according to ABC7 News.
Meanwhile, Huang's brother, Will Huang, said that Walter Huang was a safe driver who always had his hands on the wheel, according to ABC7. And Walter Huang's wife, Sevonne Huang, said that Walter had complained about the Tesla Autopilot steering toward the same barrier on several occasions, ABC7 said.
The statement didn't address the previous alleged Autopilot mishaps, but in a March 30 blog post, Tesla wrote that "the driver's hands were not detected on the wheel for 6 seconds prior to the collision," and that Huang had 5 seconds and about 492 feet (150 meters) of "unobstructed view of the concrete divider … but the vehicle logs show that no action was taken."
However, the company may be sending mixed messages about where a driver's hands can go. The Tesla Autopilot website states that "Every driver is responsible for remaining alert and active when using Autopilot and must be prepared to take action at any time." However, a video on the website shows a driver with his hands on his knees, not the wheel, which Patrick Traughber, a product manager at Twitter, pointed out in a tweet.
The statement also does not address what may be a more fundamental challenge for Autopilot technology: human nature. Research shows it can be difficult for drivers to maintain vigilance while a vehicle is on Autopilot. For instance, during a flight-simulation exercise, participants were more likely to identify an automation failure in the first 10 minutes of a 30-minute exercise than in the last 10 minutes, indicating that the longer a system performs without error, the less vigilant people become, Undark reported.
Tesla also noted that the crash was so severe because the crash attenuator — the safety barrier on the highway that is designed to reduce impact — was damaged from a previous accident and hadn't been replaced yet.
Meanwhile, the National Transportation Safety Board is unhappy that Tesla is discussing details of the crash, which is still under investigation, according to The Washington Post.
Here is the entire statement from Tesla:
"We are very sorry for the family's loss.
According to the family, Mr. Huang was well aware that Autopilot was not perfect and, specifically, he told them it was not reliable in that exact location, yet he nonetheless engaged Autopilot at that location. The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.
The fundamental premise of both moral and legal liability is a broken promise, and there was none here. Tesla is extremely clear that Autopilot requires the driver to be alert and have hands on the wheel. This reminder is made every single time Autopilot is engaged. If the system detects that hands are not on, it provides visual and auditory alerts. This happened several times on Mr. Huang's drive that day.
We empathize with Mr. Huang's family, who are understandably facing loss and grief, but the false impression that Autopilot is unsafe will cause harm to others on the road. NHTSA found that even the early version of Tesla Autopilot resulted in 40 percent fewer crashes and it has improved substantially since then. The reason that other families are not on TV is because their loved ones are still alive."
Original article on Live Science.
Laura is the archaeology and Life's Little Mysteries editor at Live Science. She also reports on general science, including paleontology. Her work has appeared in The New York Times, Scholastic, Popular Science and Spectrum, a site on autism research. She has won multiple awards from the Society of Professional Journalists and the Washington Newspaper Publishers Association for her reporting at a weekly newspaper near Seattle. Laura holds a bachelor's degree in English literature and psychology from Washington University in St. Louis and a master's degree in science writing from NYU.