Consumer Reports calls
on Tesla to disable, rename autopilot feature.
The Washington Post (7/14, Bogage, 9.18M) reports
that on Thursday, Consumer Reports called for “Tesla to disable its
semiautonomous autopilot mode in the wake of a May crash fatality in which
autopilot failed to alert a driver of an oncoming vehicle.” Consumer Reports
wrote in a blog post, “While the exact cause of the fatal
accident is not yet known, the incident has caused safety advocates, including Consumer
Reports, to question whether the name Autopilot, as well as the marketing hype
of its roll-out, promoted a dangerously premature assumption that the Model S
was capable of truly driving on its own.” According to the Washington Post, the
magazine “asked Tesla to disable autopilot’s ‘autosteer’ system, issue new
guidance to drivers about the system’s use, discontinue beta releases of
semiautonomous technology and rename the autopilot feature.”
Bloomberg News (7/14, Hull, 2.07M) reports
that the Consumer Reports article called Tesla’s Autopilot “Too Much Autonomy
Too Soon.” Laura MacCleery, Consumer Reports’ vice president of consumer policy
and mobilization, said, “By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false
sense of security.” The article continues, “In the long run, advanced active
safety technologies in vehicles could make our roads safer. But today, we’re
deeply concerned that consumers are being sold a pile of promises about
unproven technology. ‘Autopilot’ can’t actually drive the car, yet it allows
consumers to have their hands off the steering wheel for minutes at a time.
Tesla should disable automatic steering in its cars until it updates the
program to verify that the driver’s hands are on the wheel.” MacCleery appears
on CNBC’s Power Lunch (7/14, 282K) to discuss the
story further.
The Los Angeles Times (7/14, Peltz, 4.09M) reports
that Tesla “has emphasized that Autopilot is still in a beta phase of
introduction and has limitations” and has warned drivers “to stay alert and
keep their hands on the steering wheel because the technology does not provide
fully autonomous driving.” However, Consumer Reports says that “these two
messages – your vehicle can drive itself but you may need to take over the
controls at a moment’s notice – create potential for driver confusion.” The
magazine added, “It also increases the possibility that drivers using Autopilot
may not be engaged enough to react quickly to emergency situations.”
Business Insider (7/14, 3.06M) reports that
the consumer magazine also called on Tesla to “test all safety-critical systems
fully before public deployment; no more beta releases.”
Tesla, Musk decline to disable or rename system. The AP (7/14, Krisher, Durbin) reports that “a
Tesla spokeswoman said the company has no plans to change the name, and that
data it collects show drivers who use Autopilot are safer than those who
don’t.”
USA Today (7/14, Bomey, 6.31M) mentions that
Tesla’s comments come after the NHTSA and the NTSB announced investigations of
a fatal crash involving a Tesla Model S while in Autopilot mode. The article
adds that “Tesla CEO Elon Musk has refused to disable the system, which could
be done through an over-the-air software update, and has instead repeatedly
defended it and said it’s safer than human driving.” In a statement released
Thursday, the company said, “Tesla is constantly introducing enhancements
proven over millions of miles of internal testing to ensure that drivers
supported by Autopilot remain safer than those operating without assistance.”
NHTSA requests data on
Tesla’s autopilot technology.
The New York Times (7/12, Vlasic, Boudette,
Subscription Publication, 14.18M) reports on the front page of its business
section that federal officials are stepping up “their investigation of the
fatal crash of a driver operating a Tesla car with its Autopilot system
engaged.” The National Highway Traffic Safety Administration (NHTSA) “on
Tuesday released a detailed set of questions for the carmaker about its
automated driving system, particularly the emergency braking function.” The nine-page letter the agency sent Tesla
indicated that the NHTSA “was investigating whether there are defects in the
various crash-prevention systems related to Autopilot.” The article mentions
that the NTSB, “which more typically investigates airline accidents,” is also
investigating the crash.
Business Insider (7/12, DeBord, 3.06M)
specifies that the “NHTSA has asked Tesla to provide extensive information on
the crashed Tesla’s Forward Collision Warning (FCW) and Automatic Emergency
Braking (AEB) systems, as well as the Autosteer function that enables a Tesla vehicle
in Autopilot mode to navigate a roadway.”
The AP (7/12, Krisher) reports that the NHTSA is
seeking to determine why the Autopilot technology “failed to detect a
tractor-trailer that crossed in front of a Model S sedan May 7 in Williston,
Florida.” The investigators are “zeroing in on the limitations of the system
and how it reacts when obstacles cross its path.” The majority of the inquiry
focuses “on how the system works at intersections with crossing traffic, but it
also asks Tesla to describe how the system detects ‘compromised or degraded’
signals from cameras and other sensors and how such problems are communicated
to drivers.” The NHTSA “also asked Tesla for its reconstruction of the Brown
crash” (the May 7 accident that killed driver Joshua Brown), “and for details
of all known crashes, consumer complaints and lawsuits
filed or settled because the Autopilot system didn’t brake as expected,” and
“said Tesla must comply with its request by Aug. 26 or face penalties of up to
$21,000 per day, to a maximum of $105 million.”
Writing an analysis for Seeking Alpha (7/12, 660K), Paulo Santos
writes that the “NHTSA’s probe into autopilot performance has widened,” and is
“asking not just [for] information on this particular fatal autopilot accident, but
also on other accidents where autopilot might have been involved.” Santos says
that now the “NHTSA is trying to get a better grip on autopilot performance in
general.”
Bloomberg News (7/12, Shields, 2.07M) reports
the NHTSA “says it hasn’t made a determination about whether the vehicles are
defective and described the information request as a ‘standard step’ in the
preliminary evaluation of Tesla’s automated driving system.”
USA Today (7/12, Bomey, 6.31M) reports that
“the safety of Tesla Motors’ partially self-driving car technology is the
subject of a National Transportation Safety Board investigation after a crash
that killed a driver in Florida who had activated the system in his vehicle.”
According to the article, “the NTSB has sent a team of investigators to open an
investigation into the crash.” USA Today says that the “NTSB’s investigation is
particularly notable because the organization’s car-crash probes typically
center on emerging technologies.” NTSB Spokesman Christopher O’Neil said in an
email that “the NTSB investigation will be looking more comprehensively at whether
the crash reveals systemic issues that might inform the future development of
driverless cars and the investigation of crashes involving autonomous
vehicles.”
MLive (MI) (7/12, 762K) reports that the
NTSB’s investigation has been launched alongside investigations by the NHTSA
and the SEC. While the NTSB’s investigation will focus on the broad issue of
semi-autonomous driving systems in the US, the NHTSA will look specifically at
Tesla’s crash-avoidance system and the SEC will investigate whether Tesla
failed to disclose information to investors.
Business Insider: Tesla should be most concerned with NTSB
investigation. In an analysis, Business Insider (7/12, DeBord, 3.06M) says
that “the real problem for Tesla is that the Florida accident is also being
investigated by the National Transportation Safety Board (NTSB).” BI says that
“it’s entirely possible that the NTSB will recommend that self-driving
technologies be far more rigorously tested and regulated, placing Tesla in the
position of having to disable Autopilot features or withdraw the system.” The
analysis concludes that “a damaging NTSB report could undermine Tesla’s
dominant narrative: that it’s the car maker of the future,” which is why the
NTSB investigation is the one “that Elon Musk should be most concerned about.”
Tesla has no plans to disable autopilot feature. The Wall Street Journal (7/12, Ramsey, Spector,
Bach, Subscription Publication, 6.27M) reports that Tesla Motors CEO Elon Musk
says the company has no plans to disable the Autopilot feature in its cars in
the wake of a fatal in May involving the Model S. Instead, Musk said the
company is planning to heighten efforts to educate customers on how the system
works and how to properly use it.
CNBC (7/12, Ferris, 2.45M) reports that Tesla
Motors “sees Autopilot as a ‘lifesaving technology’ according to Dow Jones, and
said it will ‘redouble its efforts to educate customers.’”
Reuters (7/12, Sadam) reports that Tesla “is
planning an explanatory blog post to educate customers.” Musk said in an
interview, “A lot of people don’t understand what it is and how you turn it
on.”
Nissan Leaf, Sentra
recalled over airbags.
Edmunds (7/13, Lienert, 354K) reports that
Nissan North America is recalling 4,355 model-year 2016 Nissan Leaf and Sentra cars over
an airbag problem. “The wiring harness connector may disconnect from the
dual-stage passenger airbag,” the NHTSA said in its recall summary. “If the
wiring harness disconnects, the passenger airbag may not deploy during a crash,
increasing the risk of injury.” Automotive Fleet (7/13, 62K) adds that the
recall covers vehicles manufactured between February and March of this year.
Dealerships will inspect the wiring harness and repair it if necessary at no cost.
Additional coverage is available from Cars (7/13, 876K) and Leftlane News (7/13, 1K).
Third
Tesla accident blamed on autopilot. USA Today (7/11, Gardner, Bomey, 6.31M)
reports that “for the third time in two weeks, a Tesla electric vehicle has
crashed with the driver telling authorities that the car’s Autopilot
self-driving system was engaged at the time.” Tesla “said that it is looking
into the crash and could not confirm whether Autopilot was a factor.” According
to the Montana Highway Patrol, the driver “said he activated the car’s
Autopilot driver assist system at the beginning of the trip.”
Digital Trends (7/11, Glon, 354K) reports that
the “Tesla owner is blaming” the accident “on the company’s semi-autonomous
Autopilot technology.” A message posted on the Tesla Motors Club forum
explains that the poster’s friend “was traveling in a Model X at about 60 mph in a
55-mph zone with Autopilot turned on when the crossover veered off the road and
hit a wooden guardrail.” The article mentions that the NHTSA has not yet
commented on this crash as it is already investigating two other accidents
involving Tesla cars.
CNET News (7/11, Musil, 609K) reports that the
autopilot feature “failed to detect an obstacle in the road.” Tesla has yet to
comment on the accident. Also reporting on the story are Motor Trend (7/11, Pleskot, 7.17M), BGR (7/11, Epstein, 223K), the Detroit Free Press (7/11, Gardner, 1.02M), Road and Track (7/11, Silvestro, 3.55M), Fusion (7/11, 413K), Jalopnik (7/11, 633K), AutoGuide (7/11, 34K), Autoevolution (7/11, 5K), Hot Hardware (7/11, 1K), and Gas 2.0 (7/11, 3K).
NHTSA probing Tesla’s
autopilot mode after two crashes.
ABC News (7/7, Perlow, 4.15M) reports the
National Highway Traffic Safety Administration has launched a preliminary
investigation into Tesla’s automated system following the death of Joshua
Brown, who was driving his Tesla Model S in autopilot mode when both the car
and driver failed to notice a tractor-trailer crossing two lanes of traffic in
an intersection, sending the Tesla careening underneath the trailer. Prior to the
crash, Tesla CEO Elon Musk told Bloomberg, “We’re going to be quite clear with
customers that the responsibility remains with the driver,” adding, “we’re not
asserting that the car is capable of driving in the absence of driver
oversight.” The company is calling Brown’s death a “tragic loss.”
The Pittsburgh Post-Gazette (7/7, Moore, 533K)
reports another investigation is underway for a second crash involving Tesla’s
autopilot mode. The accident occurred July 1 on the Pennsylvania Turnpike when
a man driving his Tesla Model X in self-driving mode rolled the SUV after crashing
into barriers on both sides of the highway; both the driver and passenger
survived the crash. This investigation comes on the heels of the May 7 Florida
crash that killed Joshua Brown. “Over-reliance creates more risks in using this
technology,” said David L. Strickland, a former NHTSA administrator who is
leading the newly formed Self-Driving Coalition for Safer Streets. This group
includes Google, Uber, Lyft, Ford, and Volvo, which are all pushing for favorable
rules for the technology ahead of the NHTSA’s expected release of updated
guidelines for self-driving cars. Mr. Strickland also said, “My member
companies and every automaker that’s working on full self-driving technology is
absolutely, positively working hard to ensure that when this technology is
placed in the hands of consumers, that it is going to operate at the highest
level of safety.”
NHTSA investigates
second Tesla vehicle crash.
The Wall Street Journal (7/6, Spector, Ramsey,
Subscription Publication, 6.27M) reports that, days after launching a formal probe of
Tesla Motors’ Autopilot system linked to a fatality in Florida, the NHTSA is
examining a second collision in Pennsylvania. Tesla said it doesn’t have evidence
Autopilot was in use at the time of the crash, and the Tesla SUV driver
declined to comment Tuesday.
CNBC (7/6, 2.45M) reports that the second
crash involved a Tesla Model X SUV that was “reportedly in autopilot mode when
it rolled onto its roof on the Pennsylvania Turnpike.” The NHTSA said it is
currently collecting information from Tesla, the Pennsylvania State Police, and
the vehicle’s driver “to determine whether automated functions were in use at
the time of the crash.” The article notes that initially, Tesla’s spokesperson
said the company had “no reason to believe that Autopilot had anything to do
with this accident,” but the statement was later revised to say there was “no data at
this point to indicate that Autopilot was engaged or not engaged.”
Reuters (7/6) reports that according to
Pennsylvania State Police, “the Model X struck a turnpike guard rail, then
veered across several traffic lanes and into the median, where it landed on its
roof in the middle of the roadway.” The driver and passenger in the car were
uninjured.
The Detroit Free Press (7/6, Gardner, 1.02M)
reports that Tesla said in a statement, “We received an automated alert from
this vehicle on July 1 indicating air bag deployment, but logs containing
detailed information on the state of the vehicle controls at the time of the
collision were never received.” Tesla added, “This is consistent with damage of
the severity reported in the press, which can cause the antenna to fail.”
Second accident occurs
involving Tesla in autopilot mode.
The Detroit Free Press (7/5, Gardner, 1.02M)
reports Michigan art gallery owner Albert Scaglione and his son-in-law, Tim
Yanke, survived a Friday accident in which their 2016 Tesla Model X crashed
into a guard rail and concrete median and rolled over while in Autopilot mode.
The incident occurred just two days after the NHTSA announced it had launched an
investigation into a May collision involving a Model S in Autopilot that killed
the driver.
The Huffington Post (7/5, Mclaughlin, 367K)
reports Tesla attributed the May collision to “extremely rare circumstances”
in which neither the vehicle’s sensors nor the driver applied the brakes.
NHTSA continues to investigate fatal Tesla crash. USA Today (7/5, Woodyard, 6.31M) reports that
the recent fatal crash of a Tesla Model S driver has raised concerns about the
need for stronger federal regulation of self-driving technology. Rosemary Shahan,
president of Consumers for Auto Reliability and Safety, argued that the Model
S “wasn’t ready to go out on the road.” Shahan added, “If you have a system
called Autopilot that cannot distinguish between the side of a truck and the
open sky, it’s not ready.” The article mentions that the NHTSA is developing
guidelines for self-driving cars. At the same time, the NHTSA
is also “investigating whether [Joshua] Brown might have been distracted while his 2015
Tesla Model S was in Autopilot mode as a truck crossed his path.” NHTSA issued
a statement last week that it “will examine the design and performance of the
automated driving systems in use at the time of the crash.” Spokesman Bryan
Thomas “declined to elaborate” further on the investigation on Tuesday.
Tesla notified regulators about autopilot crash nine days after
incident. Reuters (7/5, Sage, Lienert) reports Tesla
Motors said on Tuesday that it alerted regulators to a fatality in one of its
Model S sedans in partial self-driving Autopilot mode nine days after the car
crashed in Florida. Also on Tuesday, CEO Elon Musk, responding to a Fortune
magazine article about the timing of the disclosure, tweeted that the May 7 fatality
“wasn’t material” to Tesla. The company was obligated to disclose the fatality
to regulators during its third quarter but notified them earlier, on May 16, as
it was investigating. “Tesla then provided NHTSA with additional details about
the accident over the following weeks as it worked to complete its
investigation, which it ultimately concluded during the last week of May,” a
Tesla spokeswoman said.