Tesla Tries To Address Autopilot's Fatal Flaw Ahead of Full Self Driving Update


Tesla's Autopilot system has been involved in multiple crashes of Tesla cars with stationary objects. Here is what Tesla is trying to do to protect itself from liability and to protect its owners.

Tesla markets and promotes Autopilot to buyers as a system that drives the vehicle in certain situations. At the same time, it attaches a long list of qualifiers intended to keep regulators and lawyers from blaming the company when Autopilot does drive the car and then crashes. Included in this fine print is the requirement that the driver maintain control at all times (which raises the question of what purpose Autopilot serves), including steering, or being prepared to steer, in an emergency.

One difficult middle ground for Tesla has been how much contact with, or movement of, the steering wheel the driver/occupant must maintain. After the first fatal Tesla crash involving Autopilot, the NTSB released a series of findings and recommendations on how Autopilot should be improved to prevent more crashes. The board found that the driver of that vehicle, a former U.S. military special forces member and high-tech consultant, had over-relied on the automation and lacked an understanding of the system's limitations. That over-reliance translated into the driver neither steering nor braking as the vehicle headed straight into the side of a tractor-trailer in broad daylight. The NTSB's most pointed observation was the following:

• The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.

Since that first fatality, Tesla has modified its Autopilot system many times. Yet Teslas with Autopilot enabled have since crashed into parked firetrucks twice, into a police car once, and into a highway lane divider; the last of these was another fatality. The latest crash was just this month. Obviously, there is a disconnect between how Autopilot can safely function and how some users are employing the technology.

To offer more protection, and possibly also to cover its back, Tesla is again modifying its Autopilot system. The latest update, detailed well by one owner in the video below, adjusts the vehicle's steering behavior and changes how, and how often, a driver/occupant must interact with the wheel. This is Tesla trying to engage the driver and make sure they are attentive. Or to create plausible deniability if (when?) another crash occurs. Elon Musk discussed the most recent update to Autopilot's fatal flaw via Twitter. Here is the exchange:

You can be the judge as to whether this is a real attempt at solving a problem that the NTSB says contributed to one fatality and that also appears to have played a part in a second.
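For readers wondering what "monitoring the driver's interaction with the steering wheel" actually amounts to, here is a rough, purely illustrative sketch of an escalating hands-on-wheel nag. It is not Tesla's code: the torque threshold, the timing values, and every function name are assumptions invented for this example.

```python
# Illustrative sketch only -- not Tesla's implementation. The torque threshold,
# timing values, and callback names are all invented for this example.
import time

HANDS_ON_TORQUE_NM = 0.3   # assumed minimum steering torque that counts as "hands on"
VISUAL_ALERT_AFTER_S = 30  # assumed delay before a visual warning
AUDIO_ALERT_AFTER_S = 45   # assumed delay before an audible warning
DISENGAGE_AFTER_S = 60     # assumed delay before the system gives up

def monitor_engagement(read_steering_torque, warn_visual, warn_audio, disengage):
    """Escalate warnings the longer no torque is felt on the steering wheel."""
    last_hands_on = time.monotonic()
    while True:
        if abs(read_steering_torque()) >= HANDS_ON_TORQUE_NM:
            last_hands_on = time.monotonic()   # wheel input detected; reset the clock
        idle = time.monotonic() - last_hands_on
        if idle >= DISENGAGE_AFTER_S:
            disengage()                        # hand control back / bring the car to a stop
            return
        if idle >= AUDIO_ALERT_AFTER_S:
            warn_audio()
        elif idle >= VISUAL_ALERT_AFTER_S:
            warn_visual()
        time.sleep(0.1)                        # poll roughly 10 times per second
```

The NTSB's complaint, in effect, is that torque on the wheel is a weak proxy for attention: a driver can satisfy a loop like this while looking at a phone, which is exactly the gap the board flagged.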

The second fatality, also in broad daylight, killed a high-tech worker employed by Apple who was not new to Autopilot. A post-crash analysis by the NTSB revealed the following details regarding the steering inputs from the driver/occupant who was killed:

• During the 18-minute 55-second segment, the vehicle provided two visual alerts and one auditory alert for the driver to place his hands on the steering wheel. These alerts were made more than 15 minutes prior to the crash.
• During the 60 seconds prior to the crash, the driver’s hands were detected on the steering wheel on three separate occasions, for a total of 34 seconds; for the last 6 seconds prior to the crash, the vehicle did not detect the driver’s hands on the steering wheel.
• At 7 seconds prior to the crash, the Tesla began a left steering movement while following a lead vehicle.
• At 4 seconds prior to the crash, the Tesla was no longer following a lead vehicle.
• At 3 seconds prior to the crash and up to the time of impact with the crash attenuator, the Tesla's speed increased from 62 to 70.8 mph, with no precrash braking or evasive steering movement detected.

In the latest fatality, the Autopilot system steered the Tesla vehicle into the lane divider. One might assume this was a unique or one-time occurrence, but it isn't. On May 22nd, on Tesla's own internal blog, member "coolnewworld" posted a story titled "Is it safe to use Autopilot - An extended real-world test." In the post, the owner says, "We took our Model 3 through mountains, deserts, narrow 2 lane highways, beach areas, hills, valleys, heavy and light traffic, construction zones, and various conditions and weather including thick fog. You name it. Autopilot was used for well over 2,500 miles of these 3000 miles driven. I completed my tests just yesterday.
Autopilot performs exceptionally well, but it isn't quite perfect yet. The worst incident (of only 2 incidents total) that I encountered? Autopilot suddenly veered towards a center divider on the Las Vegas strip."

The driver/occupant was able to grab the wheel and, in his words, "easily" correct in time. The post is about three weeks old, and presumably the Tesla owner who wrote it had the most recent updates. Even with those updates, Autopilot is still being reported by loving fans as steering them into or towards solid objects.

Experts working with Tesla know this and have known it for some time. We recently asked an expert from MIT involved in a study helping to evaluate Autopilot's safety why Teslas on Autopilot keep steering into things like firetrucks and lane dividers. His reply (which was recorded on video) was, "We know there is an issue with the detection of static objects."
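That admission maps onto a trade-off common to radar-based driver-assist systems: returns from stationary objects are hard to tell apart from overpasses, signs, and roadside clutter, so such systems often discount them to avoid constant false braking. The sketch below illustrates the idea in the simplest possible terms; it is not Tesla's sensor logic, and the cutoff value and field names are invented.

```python
# Simplified illustration of why stationary obstacles get missed.
# Not Tesla's logic; the cutoff and data fields are invented for this example.
from dataclasses import dataclass
from typing import List

@dataclass
class RadarReturn:
    distance_m: float        # range to the detected object
    ground_speed_mps: float  # the object's own speed over the ground

STATIONARY_CUTOFF_MPS = 0.5  # assumed: slower than this is treated as roadside clutter

def select_braking_targets(returns: List[RadarReturn]) -> List[RadarReturn]:
    """Keep moving objects and discount stationary ones so the car doesn't
    brake for every overpass or sign. The side effect: a stopped firetruck
    in the travel lane gets filtered out along with the clutter."""
    return [r for r in returns if r.ground_speed_mps > STATIONARY_CUTOFF_MPS]
```

The design choice is deliberate: reacting to every stationary radar return would make the car unusable, but ignoring them means the system depends on the driver to catch exactly the scenarios that keep appearing in these crashes.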

The news this week from Elon Musk is that Tesla will launch a new version of Autopilot in just six weeks that will enable "full self-driving," which Tesla fans call FSD. Presumably, the full self-driving Teslas won't drive into things, but only time will tell.

Submitted by Jesse (not verified) on June 13, 2018 - 8:52AM


I love Tesla but I am absolutely 100 percent against any form of autonomous driving. If Tesla is smart, they should disable all autopilot and refund people the difference. People need to pay attention to the road. Airplanes have “autopilot” but the pilot is still paying attention and is ready to take over at any time. The pilot is not in the back of the plane with the stewardess.

Submitted by kent beuchert (not verified) on June 19, 2018 - 5:50PM


Only Tesla has customers who will shell out thousands of dollars for what's a non-existent capability - driverless operation. How can Tesla claim it is an autopilot when there is nothing automatic about it? Tesla blames their customers for all their problems and takes no responsibility for producing a death trap. I'm waiting for Tesla to figure out how spontaneous battery fires are also the owner's fault.