Yet another Tesla on Autopilot has crashed at full speed into the back of yet another first responder's vehicle. Tesla cars seem to have an affinity for finding and smashing into police cars and fire trucks stopped on our roadways while doing their work. Why can't Tesla, which claims to be about to release a "full-self-driving" version of its driver-assist technology, make it stop?
The Latest Tesla On Autopilot Crash
The latest crash came this week, when police stopped to assist a stranded motorist on the highway. Not only did police have their emergency lights flashing, but they had also set out road flares to alert motorists that they should slow down and proceed with caution.
Related Story From 2018: Another Tesla On Autopilot Hits Another Police Vehicle - You Can't Make This Stuff Up
Police say that an inattentive driver in a Tesla Model 3 using Autopilot, Tesla's "self-driving" driver-assist system, then slammed into their patrol car. The Tesla then kept on driving. Connecticut State Police report that after the crash, "The operator of the Tesla continued to slowly travel northbound before being stopped several hundred feet ahead by the second Trooper on scene."
Police say that the driver admitted to using the Autopilot system at the time of the crash and that the driver was not paying attention to the road, but relying on the Tesla system to keep himself, police, and other motorists safe.
Tesla History Of Autopilot Crashes
Tesla's Autopilot has been in control of Tesla automobiles during numerous crashes, with multiple fatalities. Tesla claims that drivers are supposed to remain in control with hands on the wheel at all times. However, Elon Musk himself has been videotaped not doing so, as has his (former) wife.
In addition to the fatal crashes where Model S and Model X cars have crashed on Autopilot killing the occupants of the Tesla vehicles, Tesla vehicles on Autopilot have crashed into the back of numerous fire trucks and at least two police vehicles.
Another Lucky First Responder
Tesla's Autopilot system has yet to kill an officer, EMT, or firefighter on duty, but with so many past crashes of this type, it seems like just a matter of time. Connecticut State Police say with regard to this latest crash, "Fortunately, no one involved was seriously injured, but it is apparent that this incident could have been more severe."
The driver of the Tesla operating on Autopilot was issued a misdemeanor summons for Reckless Driving and Reckless Endangerment, according to police.
Torque News has reported on Tesla Autopilot crashes and deaths since 2016. Here is a list of related stories:
- (August 2018) Third Tesla Crashes Into Back of Firetruck - That's Four Crashes Into Emergency Vehicles This Year
- (January) Tesla Police Blotter News - Tesla Driver Hits Parked Firetruck - Blames Autopilot
- Second Tesla Model S Slams Into the Back Of Firetruck - Occupant Says Car Was In Autopilot Mode
- Report: Model X In Latest Tesla Autopilot Death Accelerated And Steered Into Barrier
- Another Tesla On Autopilot Hits Another Emergency Vehicle - You Can't Make This Stuff Up
- "Stop This Nonsense": Tesla's Autopilot and Its Problems With Stationary Object Detection
- NTSB Report Eerily Predicts Tesla Model X Autopilot Fatality 6 Months In Advance
- Tesla Tries To Address Autopilot's Fatal Flaw Ahead of Full Self Driving Update
You can view additional images and read additional facts about this latest Autopilot crash at the CT State Police Facebook Page.
In addition to covering green vehicle topics, John Goreham covers safety, technology, and new vehicle news at Torque News. You can follow John on Twitter at @johngoreham.
Comments
What an ignorant article. So if a guy using cruise control 20 years ago runs into the back of someone, is the cruise control at fault? Or the driver?
Tesla has told AP users over and over that they must be ready to take over at any time. The guy admitted he wasn't paying attention. Only an idiot takes his eyes off the road for that long using technology this young. Obviously, we are at the beginning stages of autonomy. Because there are morons out there, should we stop technology from advancing?
It's just amusing to read articles like this. Had people just like you had their way 110 years ago, we'd still be riding around in horse-drawn buggies to this very day!
In reply to What an ignorant article. So by Carl (not verified)
Hmm. I don't know Carl. One of the "idiots" as you call them, who trusted Autopilot was an engineer at Apple. He died in a Model X crash. A second was a strong Tesla advocate and early adopter, a U.S. special forces operator, and high tech employee. He died in a Model S crash. These obviously intelligent Americans certainly don't seem like idiots to me. Maybe you are a better judge of character.
BTW. Try telling the truth for a change. There have been 3 deaths using Tesla AP. Tesla's AP has been used for a total of 5.5 billion miles so far. The national average for human-driven cars is 1 death every 100 million miles.
So do the math, had humans been driving those 5.5 billion miles, there would have been 55 deaths. So in reality, Tesla's AP has saved 52 lives.
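The back-of-the-envelope comparison above can be written out explicitly. A minimal sketch, using the commenter's own claimed figures (5.5 billion Autopilot miles, 3 deaths, 1 human-driven death per 100 million miles), which are disputed later in this thread and are not verified statistics:

```python
# All inputs are the commenter's claims, not verified data.
ap_miles = 5.5e9                 # claimed total miles driven on Autopilot
ap_deaths = 3                    # claimed Autopilot-related deaths
human_miles_per_death = 100e6    # claimed average: 1 death per 100M miles

# Deaths expected if humans had driven the same mileage at the average rate.
expected_human_deaths = ap_miles / human_miles_per_death  # 55.0
lives_saved = expected_human_deaths - ap_deaths           # 52.0

print(f"Expected deaths at human rate: {expected_human_deaths:.0f}")
print(f"Implied lives saved: {lives_saved:.0f}")
```

The conclusion is only as strong as the inputs: raise the death count to 4, as a later comment argues, and the implied figure drops to 51.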
But, hey, why report the real facts when they stand in the way of your agenda? It's hard to manipulate people if you tell the truth.
In reply to BTW. Try telling the truth by Carl (not verified)
The media is widely considering the death of Mr. Banner in May of this year to be the fourth known Autopilot fatality. A quick keyword search will help you find reports saying so. Banner was killed in a Model 3, so every current Tesla model has now had a fatal Autopilot-controlled crash. Mr. Banner's accident was about as close to identical to Joshua Brown's fatal accident as two crashes can be, other than that they were occupants of different Tesla models being driven by Autopilot when they were killed, three years apart.
John, you didn't address Carl's point. Looking at it from the fatality averages, Tesla's AP has saved 51-52 lives when compared to humans, but you only fixate on the 3-4 lives lost. And keep in mind that almost all of those lives lost were drivers who were not paying attention and did not avoid the accident either, even though Tesla warned them that this stage of AP is a driver's assist tool.

Every time that I see you bring up one of these AP accidents, implying that Autopilot is fatally flawed, I immediately think of the many similar human accidents that happen every day. Do you think that the guy who is blindly answering a text message is going to see the fire truck or police trooper who is positioned out in the middle of a driving lane? Nope. This kind of accident happens ALL the time. But for you it is big news when a computer is driving, and you can blame Tesla for being imperfect. And I have to remind you that no driving system is perfect, especially us humans.

The driver of the Tesla said that he was attending to his dog in the back seat of the car when the accident occurred, and he kept driving after he struck the police cruiser until he was pulled over by the second cruiser. Your article didn't once mention that nobody was injured at all in the accident.
In reply to John, you didn't address Carl by DeanMcManis (not verified)
What are the fatality averages overall for luxury-priced vehicles with advanced driver aids and active safety? How does Tesla rank? Why would one compare Tesla's performance to older vehicles, lower-priced vehicles, or vehicles not equipped with advanced active driver aids? The news isn't just "big" for me. I first heard of this latest Autopilot-related incident while watching New England Cable News (a Comcast station serving 7 states), which picked it up from the CT State Police reports. A quick keyword search will show every national news outlet has picked up this latest chapter in an ongoing Tesla safety story, now in its fourth year. Tesla fans want to brush it under the rug, like they do any story that isn't an advertisement for Tesla. Electrek's latest Autopilot story is "Watch Tesla’s Autopilot-powered safety features stop for pedestrians in impressive tests." So, apparently Autopilot safety stories are news at EV-advocacy publications, but only when the news is good.
These are accidents that we are talking about, and Tesla's AP is safety equipment that helps prevent accidents. First off, the Tesla Model 3 is not a "luxury-priced" vehicle. The average new vehicle price in 2019 is just over $37K, and the average new pickup truck price is $48K, so the Tesla Model 3's price is right in the middle. Unless you consider average pickup trucks to also be luxury vehicles.

The reason why you would compare the Tesla's safety to all the other vehicles out there is simply that it is important to look at all vehicles when judging whether any safety technology brings an improvement. That is just using the scientific method to look at the problem of automobile accidents and compare this AP technology against the other safety technologies out on the road, which includes all the other vehicles out there with poor, older safety equipment.

The reason why this Tesla crash story is picked up all over is that there are active Tesla stock short-sellers who make money every time another story is published that makes Tesla look bad, and because they have been so thorough about making sure the bad stories keep coming (to make more money), the news outlets pick it up because it gets hits. I have no doubt that Toyota and Honda and every other automaker's safety systems also cause accidents (likely at a higher rate than Tesla's), because no computer system is ever 100% perfect, but because there aren't groups of silent investors betting against Honda's and Toyota's and other automakers' success, those stories are not being written and distributed with such vigor. Just follow the money and you can see the key motivations. When was the last time that you published any story about Tesla's Autopilot preventing an accident? I cannot recall a single story.
In reply to These are accidents that we by DeanMcManis (not verified)
Of course the Model 3 is luxury priced. It is a compact sedan with the same space and capacities as a Hyundai Elantra or Honda Civic costing literally half as much. You like to cite the "starting price" of the Model 3, which you know is nowhere close to the average transaction price for the Model 3s that have been sold with Autopilot. The last time I personally reported on a positive Tesla safety system result was October 29th. The Model 3 was included in the pedestrian safety system test results conducted by IIHS. It finished just behind the Subaru and Nissan in the rankings. On September 19th, I published two stories here about the Model 3 having earned the Top Safety Pick Plus ranking from IIHS. That was newsworthy since no Tesla had ever achieved it before. "Safety system works as advertised" is not exactly news, but if you like that sort of paid promotion, the Tesla advocacy publications have it available daily.
The Tesla Model 3 is not a compact car or an economy car. It is a mid-sized/intermediate car, and as I mentioned, it is priced right in the middle of the average sales price of cars in America. Autopilot is included in that price, but Full Self Driving is optional.

Addressing your main point about full self driving: I agree that when that feature goes live, the software and hardware will need to approach error-free driving. Which is indeed a huge goal, because humans are nowhere near 100% error free, and all drivers must contend with rare and extreme cases where vehicles are parked where they are not supposed to be (like the middle of a driving lane). When viewed objectively, Tesla's advanced software is equal to or better than the best rival offerings, but this is a very small group of competitors.

Part of Tesla's problem is that they are labeling the top self-driving option "Full Self Driving", and that name implies that the Tesla models with those features, software, and hardware will be able to deliver level-5 autonomous driving, whereas currently Autopilot is operating at level-3 autonomy. And like other current level-2 and level-3 driving systems, they are not 100% without error, which is why they require drivers to monitor the road and intervene when necessary (which the drivers in these rare accidents did NOT do). It is a valid question whether any commercial auto-driving system is ready to fully take over all driving duties, but I believe that Tesla's self-learning system is the best one out there today, considering its scope and adaptability.

From the mess that I see just in my 80-mile daily commute, I would say that even being an alert and experienced driver, I face driving challenges from impatient drivers, inattentive drivers, and intoxicated drivers daily. Would I trust Tesla's FSD Autopilot to handle those conditions and get me to work and back safely? Yes. But then again, I am expecting 99.9% accuracy, not 100%.
In reply to The Tesla Model 3 is not a by DeanMcManis (not verified)
I think you are right that 97 cubic feet of passenger space is midsized/intermediate. I looked up the average transaction price of cars that size according to KBB (via PR Newswire). The ATP for vehicles this size is $25,775. The Model 3's ATP is double that. KBB lists the entry-level luxury car ATP as $42,975, still significantly below the ATP of the Model 3. Yes, we know that Tesla does offer a "bait and switch" (Electrek's descriptor) low-priced trim by special order, just as one can theoretically buy a new Elantra the same size as a Model 3 for around $19K. My point is that comparing the Model 3's crash record to that of all cars on the road is silly. It should be judged within its price segment and against vehicles with active safety systems and advanced driver aids. "Other cars crash too" is not satisfying to me as an excuse for why Teslas are killing occupants while operating on Autopilot and have crashed into so many parked first responder vehicles. Others may be cool with it. I get that. Here is the ATP data source: https://www.prnewswire.com/news-releases/average-new-car-prices-up-nearly-4-percent-year-over-year-for-may-2019-according-to-kelley-blue-book-300860710.html
I understand that you are trying to be dramatic, but Tesla's Autopilot technology has been shown to have far fewer accidents than the average vehicle. It seems silly to narrow any comparison to a small group of cars, when the point of any safety technology is to drive out in the real world, with the full population of vehicles. Is Tesla's Autopilot safer than driving most cars? Absolutely. If you were evaluating seat belts at a time when most vehicles didn't have them, would it show their safety value if you only compared them to other vehicles with seat belts? No, the value of any safety technology shows when it is compared to ALL vehicles. One key additional point is that other automakers implementing semi-autonomous driving are not going to advertise any accidents, for reasons of liability and reputation, so you are not going to get valid comparison data anyway. Another point is that Tesla has far more cars on the road using their Autopilot technology. Tesla revealed their safety record for 2019 as: “In the 1st quarter, we registered one accident for every 2.87 million miles driven in which drivers had Autopilot engaged,” the automaker said today. “For those driving without Autopilot, we registered one accident for every 1.76 million miles driven. By comparison, NHTSA’s most recent data shows that in the United States there is an automobile crash every 436,000 miles.”
Could it be that all the flashing lights are blinding the cameras or confusing the Tesla computer? The Tesla should stop (or at least try to stop) if a vehicle is stopped in front of it, even if not on FSD, shouldn't it? We need to understand what happened in these cases.