Autopilot, Tesla's self-driving, or rather semi-autonomous, driving system, is not meant to be engaged without an attentive driver at the wheel. It's meant to supplement a driver's reflexes, not replace them. Yet Tesla has neglected to build in pressure sensing on the steering wheel, or any system that pulls the car off the road when the driver is no longer actively piloting the vehicle.
I feel that this puts the responsibility for any wrecks fully in Tesla's hands, as the company has made no attempt to stop people from making stupid decisions beyond telling them that it's a stupid decision. Tobacco companies must pay millions of dollars each year for "stop smoking" advertisements because use of their product is correlated with the development of lung cancer.
Tesla's manual clearly states that Autopilot is only to be used on divided highways, yet it can be engaged on any road. If the system is smart enough to follow the rules of the road well enough to help you, it needs to be able to do so on its own in case you fall asleep; that's the entire point of the system. If it is not smart enough to tell when it is safe to turn on, is it smart enough to be turned on in the first place?
Ford's Escape Hybrid is being tested to become fully autonomous. Ford is performing the tests on the streets of "Pittsburgh, Palo Alto, Miami, Washington, D.C., and now Detroit," using self-driving technology from Argo AI. While I'm sure many, if not all, of the people who helped create Argo AI's system hold driver's licenses, I must question whether the AI itself could pass a government driving test. Argo AI should be required to obtain a driver's license. If this technology is to come from Ford, as seems to be the case with Ford doing all of the testing, I do not feel that the owner of the vehicle can bear any responsibility for the vehicle's role in wrecks. A car that is "able to drive itself" must be able to take responsibility for what happens when it fails to predict every possible outcome of an on-the-road encounter.
When human drivers operate a vehicle, they must have a license and insurance, because they must (1) prove that they are capable of handling the vehicle, and (2) be able to pay for the damage their motor vehicle causes to other people or vehicles. If the owner of the vehicle is not operating it, they cannot be held responsible, because they are not operating the vehicle. The AI is operating the vehicle. The AI is responsible.
In the past, we had self-driving vehicles. They were called horses. They typically didn't get into many wrecks on their own; it wasn't until you hitched a wagon to them that they stopped being safe to drive themselves. Many cowboys fell asleep in the saddle, whether from working or partying too hard, and still made it home safely thanks to their trusty steed. That's the ideal outcome for self-driving vehicles: something that takes care of you, itself, and others, even when you (the driver) can't. Horses may be self-driving, but carriages are not. A horse cannot drive a carriage without someone at the reins, because a horse cannot read signage or truly understand that it has a huge contraption behind it, let alone how that contraption will behave. Even inexperienced human drivers have difficulty backing a trailer; why would a horse do any better? It's just a horse. Carriages were never marketed as self-driving. They never took advantage of the horse's own self-preservation instinct the way simply riding a horse does.
The next generation of vehicles is expected to do just that: take over for the driver. When professional truck drivers are involved in motor vehicle collisions, extremely large legal settlements are common.
Across the nation, there are billboards dedicated to attorneys who specialize in motor vehicle collisions. When a self-driving car causes someone to lose their life, who, if not the AI, should be held accountable? The AI was, for all practical purposes, at the wheel. The AI was programmed by a company that has determined it is competent enough on the road to keep people safe. If the AI is completely in control, the owner should not even be required to maintain insurance for the vehicle; it should fall under strict product liability against the manufacturer. That said, I am not an attorney. This is not legal advice; it is opinion, and solely mine.
I am reluctant to put my life in the hands of any computer, even though I love technology. I am sure that autonomous passenger vehicles are soon to come, and I'm sure new legal precedent will be set, but who will come out as the protected parties? Will the courts favor the owner and the injured, or will someone who is effectively a passenger in their own car be held responsible for the actions of another driver?
Frank DiMuccio has been interested in the automotive industry since his childhood. In high school, he spent his free time rebuilding his car and earned a newfound enjoyment of the grease and sweat of working in the garage. He can be followed on Twitter at @Fdimuccio4 for daily automotive news.