Helping Make our Communities Safer. Jaime is a Trial Attorney and Safety Advocate at Jaime Jackson Law in Lancaster, PA representing seriously injured victims, wrongful death and those harmed by unsafe products and corporate neglect. Contact Jaime at 717-519-7254 or email jaime@jaimejacksonlaw.com.
Tuesday, August 28, 2018
Man claims Tesla was in auto-pilot mode when it crashed into fire truck.
The San Jose (CA) Mercury News (8/25, Sanchez, 552K) reports Michael Tran, the 37-year-old driver of a Tesla who crashed his vehicle into a fire truck, said, “I think I had auto-pilot on” at the time of the accident early Saturday morning in San Jose. A Tesla spokesperson said the company “has not yet received any data from the car, but we are working to establish the facts of the incident.” The New York Post (8/27, Press, 4.46M) reports similarly.
Fiat Chrysler recalls 205K new SUVs, vans in US, Canada, Mexico due to brake issue.
MLive (MI) (8/27, Raven, 983K) reports that
Fiat Chrysler “says it is recalling more than 150,000 newer SUVs and minivans
in the United States and another 55,000 in Canada and Mexico due to a brake issue.”
The automaker “says in a news release the recall affects the 2018 Dodge
Journey, 2018-2019 Dodge Grand Caravan, 2018-2019 Jeep Compass and 2019 Jeep
Cherokee.” The automaker reports in the news release, “An investigation by FCA
US discovered certain shipments of a supplied brake-system component had not
been manufactured to specification and were inadvertently installed on vehicles
during spring 2018.”
Friday, August 17, 2018
Autonomous car advocates want pedestrians to adhere to traffic laws.
Bloomberg News (8/16, Kahn, 4.46M) reports
that some autonomous car advocates believe the large-scale adoption of
self-driving vehicles could be sped up if pedestrians can be convinced “to
behave less erratically,” such as avoiding jaywalking and crossing streets at
designated crossings where autonomous vehicles will be more likely to detect
the person. The piece mentions that the US Department of Transportation’s
latest guidance on automated vehicles “has stressed the need for such consumer
education.” Bloomberg says that the “novelty” of autonomous vehicles can lead
pedestrians to “test the technology’s artificial reflexes,” noting that Waymo
“routinely encounters pedestrians who deliberately try to ‘prank’ its
cars, continually stepping in front of them, moving away and then stepping back
in front of them, to impede their progress.”
Monday, August 13, 2018
Mazda, Suzuki and Yamaha admit to using falsified emissions data.
The Hill (8/9, Keller, 2.71M) reports
officials in Japan yesterday announced “that Suzuki Motor Corp., Mazda Motor
Corp., and Yamaha Motor Co. have admitted to using falsified emissions data in
vehicle inspections, according to multiple reports.” The Hill adds that “the
three admissions came in the midst of an internal investigation ordered by the
government, the Associated Press reported.” According to a report from Reuters,
“Suzuki most often inspected vehicles with manipulated emissions data,” and the
report added that “the company confirmed that almost half of its 12,819 new car
inspections were improper dating back to 2012.” The Hill adds that “none of the
automakers reportedly found problems in their vehicles’ correct emissions and
fuel economy performance that warranted a recall.”
Wednesday, August 8, 2018
IIHS report raises safety concerns about vehicles with automated assist features.
NBC
Nightly News (8/7, story 6, 1:45, Holt, 7.51M) reports that a new report
released Tuesday is “raising safety concerns about vehicles with automated
assist features, including the autopilot function already available on some
newer models.” The new Insurance Institute for Highway Safety (IIHS) report
“warned that electronic driver assistance systems may not see stopped vehicles,
and may lead you into a crash if you’re not careful.” Of the five models
tested, “the group found two of the Teslas, Model S and Model 3, hit a
stationary balloon when they had adaptive cruise control on.”
The AP (8/7, Krisher) reports that IIHS “said on
the road, the institute’s engineers found that all the vehicles but Tesla’s
Model 3 failed to respond to stopped vehicles ahead of them.” IIHS’ Chief
Research Officer David Zuby said, “We have found situations where the vehicles
under semi-automated control may do things that can put you and your passengers
at risk, and so you really need to be on top of it to prevent that from
happening.” Zuby “said IIHS is developing ratings for driver assist systems and
eventually will make recommendations on regulations for fully autonomous
vehicles.”
The Hill (8/7, Keller, 2.71M) reports that
“researchers expressed caution about the viability of testing self-driving
vehicles on real roads, pointing to the incident last March when a self-driving
Uber prototype hit and killed a pedestrian.” The report said, “The Uber crash
in Arizona that took the life of a pedestrian in March shows the hazards of
beta testing self-driving vehicles on public roads.”
Citing the IIHS report, Bloomberg News (8/7, Gardner, 4.46M) says that
“the Uber Technologies Inc. self-driving test vehicle that killed a pedestrian
in Arizona earlier this year may have been able to avoid the crash had the
ride-hailing company not disabled Volvo Cars’ safety system.” The report
“criticizes Uber for turning off Volvo’s collision-avoidance technology in the
XC90 sport utility vehicle that struck and killed a woman in Tempe on March
18.”
The story was reported similarly by NBC News (8/7, Eisenstein, 5.76M), CBS News (8/7, Van Cleave, 6.78M), Digital Trends (8/7, Edelstein, 472K), and Business Insider (8/7, Ma, 5.65M).