TX - Driverless TESLA crashes and burns, 2 passengers die, Houston, Apr 2021

CocoChanel

  • #1
  • #2
This horrible accident happened in an upscale suburban area of Houston late Friday, April 16.
The fire took 4 hours to extinguish, which is unbelievable.
Victims have not been IDed yet, but are reportedly 2 males, ages 59 and 69.
Reports are that the Tesla was traveling at a high rate of speed and did not successfully make a turn.

‘No one was driving the car’: 2 men dead after fiery Tesla crash in Spring, officials say

“The owner, he said, backed out of the driveway, and then may have hopped in the back seat only to crash a few hundred yards down the road. He said the owner was found in the back seat upright.

“The brother-in-law of one of the victims said relatives watched the car burn for four hours as authorities tried to tap out the flames.

Authorities said they used 32,000 gallons of water to extinguish the flames because the vehicle’s batteries kept reigniting. At one point, Herman said, deputies had to call Tesla to ask them how to put out the fire in the battery.”


They had to let the fire burn out... it happened in the neighborhood. :(
I think it’s The Woodlands area, not Spring.
 
Last edited:
  • #3
Good Golly - SMH - Horrible. Those poor men, and those poor fire-fighters. That car...
Driver-less car -- fuggetaboudit. No way, no how, but that's just me. I still miss my four-on-the-floor.
We'll see what happens. Lawsuits -- Wrongful Death perhaps. $$$$

Thanks for the thread, @CocoChanel. Still SMH.
 
  • #4
  • #5
I dislike fearmongering about automated cars... ultimately I think they're going to be far safer than humans, and large road tolls* will be a thing of the past.
But at this stage I'm pretty sure it's still the law that someone has to be in the driver's seat, yeah? You can't blame the Tesla if they were skylarking and treating the car like a joke.

*Just in case the number of deaths caused by accidents isn't called a 'road toll' elsewhere, I just want to be clear that's what I mean, not the tolls you pay for using a paid motorway or bridge or whatever
 
  • #6
Cause? Did an Autopilot malfunction set this tragedy in motion? Or the owner's over-reliance on the Autopilot system? Or something else?

Just gen. info re Tesla fires and Autopilot crashes:
Tesla, Inc. - Wikipedia (sections on fires and Autopilot crashes): https://en.wikipedia.org/wiki/Tesla,_Inc.
Tesla Autopilot - Wikipedia

Tesla is not the only vehicle manufacturer with issues with electric-vehicle fires.
Some are post-crash fires, like this one; others occur while vehicles are on the road, and some while batteries are charging.
Plug-in electric vehicle fire incidents - Wikipedia

I wonder -
whether fire departments can develop methods, materials, and equipment to suppress these fires so they do not burn so long.
whether Tesla & other manufacturers can or will do the same, and maybe supply special extinguishers with the cars.
 
  • #7
That would be a nightmare if a Tesla crashed into a home or restaurant, and the battery kept reigniting for hours...:eek:
 
  • #8
I dislike fearmongering about automated cars... ultimately I think they're going to be far safer than humans, and large road tolls* will be a thing of the past.
But at this stage I'm pretty sure it's still the law that someone has to be in the driver's seat, yeah? You can't blame the Tesla if they were skylarking and treating the car like a joke.

*Just in case the number of deaths caused by accidents isn't called a 'road toll' elsewhere, I just want to be clear that's what I mean, not the tolls you pay for using a paid motorway or bridge or whatever

I am not sure they will ever be far safer than humans. Every computer-operated thing that I know of has many glitches and problems. People sitting by and not paying attention, while being driven at high speed in traffic, could be sitting ducks if a glitch occurred. JMO
 
  • #9
I am not sure they will ever be far safer than humans. Every computer-operated thing that I know of has many glitches and problems. People sitting by and not paying attention, while being driven at high speed in traffic, could be sitting ducks if a glitch occurred. JMO

I might be biased, since our family income comes from working on automation technology, haha. I trust automation!
 
  • #10
What caused it to accelerate to such a high speed in such a short time after leaving the owner’s driveway?
 
  • #11
‘No one was driving the car’: 2 men dead after fiery Tesla crash near The Woodlands, officials say

“On Monday night, Memorial Hermann issued the following statement regarding the loss of Dr. Varner in this accident.

“Dr. Varner was a tremendous human being who personally impacted many throughout our Memorial Hermann The Woodlands Medical Center family over the years. Our thoughts and prayers go out to his entire family, and also to those who had the privilege of working and serving alongside him in various capacities. He will be dearly missed by so many.”

Justin Kendrick, SVP & CEO, Memorial Hermann The Woodlands Medical Center”

Tesla crash victim is named as 59-year-old doctor who was driving with a friend | Daily Mail Online
 
  • #12
Texas police to demand Tesla crash data as Musk denies Autopilot use | Reuters

Texas police will serve search warrants on Tesla Inc (TSLA.O) on Tuesday to secure data from a fatal vehicle crash, a senior officer told Reuters on Monday, after CEO Elon Musk said company checks showed the car’s Autopilot driver assistance system was not engaged.

Mark Herman, Harris County Constable Precinct 4, said evidence including witness statements clearly indicated there was nobody in the driver's seat of the Model S when it crashed into a tree, killing two people, on Saturday night.

Herman said a tweet by Musk on Monday afternoon, saying that data logs retrieved by the company so far ruled out the use of the Autopilot system, was the first officials had heard from the company.

I am not well-versed in the way of corporate culture, but it surprises me to hear there had not been any communication between LE and Tesla corporate in the aftermath of this deadly accident before the tweet was sent out to the public.
 
  • #13
Automation Technology?
I might be biased, since our family income comes from working on automation technology, haha. I trust automation!
@Eloise Just jumping off your post and speaking generally re automation technology in various fields - not about you or your family's company, and not about Tesla specifically. Not blaming the two men in the car who died a terrifying death, regardless of the details re cause.

Despite many, many, many automation technology improvements, there's always one obstacle to making systems idiot-proof: the idiots. Say a car's safety feature prevents certain auto-drive technology from operating unless the driver's hands are on the wheel. The owner's manual gives details; a warning displays on the dashboard. The idiot defeats the safety feature by placing a water bottle, an orange or a weighted bangle on the steering wheel to simulate a driver's hands - as shown in Daily Mail pix* and this 3 y/o youtube vid.**

Regardless of what caused this tragic crash, and despite any upcoming tech improvements to address it, imo it's difficult or impossible to stop people from deliberately defeating safety features.
_____________________________________
* Tesla crash victim is named as 59-year-old doctor who was driving with a friend | Daily Mail Online
** https://www.youtube.com/channel/UC9_VYZytaimsaNKPsPTuL-g/videos
 
Last edited:
  • #14
Oh that's terrible. I live near the area; The Woodlands is very nice & well off. (I'm too poor to own a blade of grass there!:p) Someone maybe did not follow the owner's manual about operating with Autopilot, imo. Also, it terrifies me that they had to call Tesla because the batteries kept reigniting. Seems like a major safety issue that needs to be fixed. You'd think a fancy & very expensive car like that would be full of safeguards.
 
  • #15
Harsh criticism of Elon Musk...
Scrutiny of fiery Tesla crash that killed 2 in The Woodlands a sign that regulation may be coming

At issue is whether Musk has over-sold the capability of his systems by using the name Autopilot or telling customers that "Full Self-Driving" will be available this year.

"Elon's been totally irresponsible," said Alain Kornhauser, faculty chair of autonomous vehicle engineering at Princeton University. Musk, he said, has sold the dream that the cars can drive themselves even though in the fine print Tesla says they're not ready. "It's not a game. This is serious stuff."
 
  • #16

Regulation should have been there from the start. I'm shocked he was allowed to sell these cars with little or no testing, particularly independent, peer-reviewed, professional testing. Shame on regulators for not doing their jobs. The NTSB seems to be champing at the bit to regulate this stuff. This is their wheelhouse and they should have been in charge of this from the start.

This article is from last year, February 2020

NTSB Chairman Bashes Tesla in Scathing Statement

"You can't buy a self-driving car today; we're not there yet," Sumwalt said, before noting how Tesla's Autopilot—which was being used when Huang's Model X crashed—is merely a Level 2 autonomous driving system that requires driver supervision at all times. The National Highway Traffic Safety Administration (NHTSA) outlines six "levels" of automated driving, with Level 4 vehicles being capable of full automation in certain driving instances and those capable of full automation in all instances taking on the top Level 5 designation. As Sumwalt pointed out in his opening statement, no car currently on sale in the United States—Teslas included—is available with Level 4 or 5 tech onboard. "But," Sumwalt added pointedly, "the driver in this crash, like too many others before him, was using Level 2 automation as if it were full automation.
 
  • #17
<modsnipped - quoted post was removed for victim blaming>

I'm not interested in being in a car that has NO driver unless it is on some kind of track like a monorail! Otherwise, I'll just stick with a human driver to get me from Point A to Point B!
 
Last edited by a moderator:
  • #18
  • #19
  • #20
I just can’t believe that any amount of intelligent programming can ever match the brain’s ability to assess risk and interpret or anticipate human behaviour. Humanity itself is probably at risk if they ever do figure out how to create programmes that can! What a horrible accident.
 
