Autopilot Warning May Not Help Tesla in Crash Defense
By: Rakteem Katakey, Margaret Cronin Fisk and Dana Hull (Insurance Journal) July 2016
Telling Tesla drivers its Autopilot feature doesn’t mean their cars can drive themselves may not be enough to keep Elon Musk off the hot seat if the technology comes up short.
This month, two Teslas equipped with Autopilot veered into barriers following disclosure of the first fatal wreck: a Model S slammed into an 18-wheeler crossing a Florida highway after the semi-autonomous car failed to distinguish the truck’s white trailer from the sky.
Tesla Motors Inc. warns drivers they must still pay attention and be ready to grab back control of the car, but there’s a lot in a name.
“The moment I saw Tesla calling it Autopilot, I thought it was a bad move,” said Lynn Shumway, a lawyer who specializes in product liability cases against carmakers. “Just by the name, aren’t you telling people not to pay attention?”
Joshua Brown’s death in Florida was the first involving Tesla’s semi-autonomous technology, triggering chatter in legal circles about who was liable for the crash and prompting a probe by the National Highway Traffic Safety Administration as well as the National Transportation Safety Board, which typically devotes its attention to mishaps involving planes and trains. Some details remain in dispute, including whether Brown, a former Navy SEAL, might have been watching a Harry Potter movie in a DVD player found in the car.
Musk had anticipated the moment for at least two years, telling drivers to keep their hands on the wheel because they will be accountable if the car crashes while on Autopilot. Tesla buyers must activate the Autopilot software, which requires them to acknowledge the technology is a beta platform and isn’t meant to be used as a substitute for the driver.
Driver’s Responsibility
When U.S. investigators began evaluating Brown’s crash, Tesla doubled down in a statement: “Autopilot is an assist feature. You need to maintain control and responsibility of your vehicle.”
But people will be people and they often don’t do what they’re supposed to do.
Lawyers compare giving Tesla drivers Autopilot to building a swimming pool without a fence; the property owner should know that neighborhood kids will find it hard to resist and may get hurt.
“There’s a concept in the legal profession called an attractive nuisance,” said Tab Turner, another lawyer specializing in auto-defect cases. “These devices are much that way right now. They’re all trying to sell them as a wave of the future, but putting in fine print, ‘Don’t do anything but monitor it.’ It’s a dangerous concept.”
As with so-called smart features before it, such as anti-lock brakes and electronic stability control, telling drivers that Autopilot might not prevent an accident won’t help Tesla in court if the technology is found to be defective, Turner said.
“Warnings alone are never the answer to a design problem,” he said.
Possible Arguments
In a court case, lawyers for accident victims or their families would have other lines of attack if Tesla blames accidents on drivers failing to heed warnings. They could assert that Tesla’s software is defective because it doesn’t do enough to make sure drivers are paying attention.
Attorneys could also argue that, in Brown’s case for example, the car should have recognized the tractor-trailer as an obstacle, or that Tesla could have easily updated its system to address such a foreseeable problem.
“Any argument will try to establish that Tesla acted in an unreasonable way that was a cause of the crash,” said Bryant Walker Smith, a University of South Carolina law professor who researches automation and connectivity. “It doesn’t even need to be the biggest cause, but just a cause.”
If Brown’s May 7 crash doesn’t end up in court, others might.
A 77-year-old driver from Michigan, a state that has passed laws allowing semi-autonomous and fully autonomous vehicles, struck a concrete median in Pennsylvania, and his 2016 Model X SUV rolled over. Also this month, a driver in Montana said his Tesla veered off the highway and into a guardrail. Both drivers said their cars were operating on Autopilot at the time, and both were cited for careless driving.
Pennsylvania and Montana are among the 42 states without legislation regulating autonomous and semi-autonomous cars.
Musk Response
Musk fired back in a tweet, saying the onboard vehicle logs show the Autopilot was turned off in the Pennsylvania crash and that the accident wouldn’t have happened if it had been on. The company said the Montana driver hadn’t placed his hands on the wheel for more than two minutes while the car was on Autopilot.
Musk and Tesla are certain to argue that while their technology has yet to meet the threshold for “autonomous vehicle,” its Model S has achieved the best safety rating of any car ever tested.
Even with that record, Consumer Reports last Thursday called on Tesla to disable Autopilot on more than 70,000 vehicles. “By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security,” said Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports.
“Tesla is consistently introducing enhancements proven over millions of miles of internal testing to ensure that drivers supported by Autopilot remain safer than those operating without assistance,” the carmaker said Thursday in a statement. “We will continue to develop, validate, and release those enhancements as the technology grows. While we appreciate well-meaning advice from any individual or group, we make our decisions on the basis of real-world data, not speculation by media.”
Khobi Brooklyn, a spokeswoman for the Palo Alto, California-based carmaker, cited the company’s earlier comments on the three accidents and declined to comment further on possible litigation involving Autopilot.
National Rules
The U.S. government will soon offer the auto industry guiding principles for safe operation of fully autonomous vehicles, part of a plan that includes $4 billion for safety research by 2026.
For now, the double line between Autopilot and full autonomy is a blurry one.
In 2013, NHTSA released a five-rung autonomous vehicle rating system based on cars’ computerized capabilities, ranging from level 0 for “no-automation” to level 4 for “full self-driving automation.”
Tesla is likely to argue its technology has yet to surpass level 2: automation designed to relieve the driver of control of at least two functions. Plaintiffs will counter that the car has been marketed more like a level 3, in which the driver can fully cede control of all safety-critical functions while remaining available for occasional intervention.
“It’s great technology. I hope they get this right and put people like us out of business,” said Steve Van Gaasbeck, an auto products lawyer in San Antonio, Texas. “There’s really no excuse for missing an 18-wheeler.”
Copyright 2016 Bloomberg.