Tesla has announced that it will drop ultrasonic sensors from all of its electric cars in favor of the cameras and computational neural networks that make up the Tesla Vision system.
Tesla has announced that it will phase out the ultrasonic sensors currently used by Autopilot in favor of the Tesla Vision system, which relies on cameras alone to map the vehicle's surroundings. The decision runs counter to what the rest of the car industry is developing, which combines cameras, ultrasonic sensors, radar and, above all, LiDAR to feed data into the software that makes decisions and informs the driver.
Last year, Tesla already announced that it would stop fitting radar for its Autopilot system, beginning the transition to the camera-only Tesla Vision system, which started shipping in Model 3 and Model Y units manufactured from May 2021 onward. Originally, the array of sensors and cameras that made up the Autopilot hardware was pitched as everything needed to eventually achieve fully autonomous driving. Fitted to all of Tesla's EV models, it included eight cameras, a forward-facing radar and several ultrasonic sensors surrounding the vehicle.
The transition to Tesla Vision abandons that original concept in favor of computer vision alone, relying solely on the data sent by the cameras and dispensing with radar and ultrasonic sensors. Intuition suggests that the more data a system can collect, the safer it should be, since redundant information can be cross-checked to catch errors. Tesla's counterargument is to follow the approach of the human brain, which navigates using only spatial vision plus intelligence. The cameras that see the entire environment, the computer that interprets it, and the neural networks that learn from data are the "technological substitutes" for these human senses. Tesla believes that data from radar and ultrasonic sensors could even be counterproductive, because conflicting inputs could contaminate the system's perception.
The change Tesla announced last year had direct consequences for some Autopilot functions, which were limited on vehicles without radar; for example, Autosteer on Tesla Vision vehicles was capped at 75 mph. Now Tesla is going one step further by eliminating the ultrasonic sensors and replacing them with camera-based vision: "Today we are taking the next step in Tesla Vision by removing the ultrasonic sensors (USS) from Model 3 and Model Y manufactured in the United States. We will continue this rollout globally on these same models over the next several months, followed by Model S and Model X in 2023."
Ultrasonic sensors (USS) are mainly used to detect nearby objects, especially in low-speed maneuvers such as parking (automatic or manual) and for collision warnings. Tesla has explained how its neural networks, which "see" through the cameras, replace these sensors: "Along with the removal of the USS, we launched our vision-based occupancy network, currently used in the beta version of Full Self-Driving (FSD), to replace the inputs generated by the USS. With today's software, this approach gives Autopilot high-definition spatial positioning, longer-range visibility, and the ability to identify and differentiate between objects. As with many Tesla features, our occupancy network will continue to improve rapidly over time."
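Tesla has not published the internals of its occupancy network, but the basic idea of an occupancy map is simple: turn distance estimates into a bird's-eye grid of free versus occupied space around the car. The toy sketch below is purely illustrative and is not Tesla's implementation; the grid size, cell resolution, and detection format are all assumptions made for the example.

```python
import math

# Conceptual sketch only: a toy 2D occupancy grid, NOT Tesla's occupancy
# network. Hypothetical (bearing, range) detections -- e.g. distances a
# vision system estimated from camera images -- are projected into a
# bird's-eye grid of free (0) vs. occupied (1) cells around the car.

GRID_SIZE = 21   # 21 x 21 cells, car at the center cell
CELL_M = 0.5     # each cell covers 0.5 m x 0.5 m (illustrative resolution)

def build_occupancy_grid(detections):
    """detections: list of (bearing_deg, range_m) pairs.
    Bearing 0 = straight ahead, 90 = directly to the right.
    Returns a GRID_SIZE x GRID_SIZE list of 0/1 cells."""
    grid = [[0] * GRID_SIZE for _ in range(GRID_SIZE)]
    center = GRID_SIZE // 2
    for bearing_deg, range_m in detections:
        rad = math.radians(bearing_deg)
        # Convert the polar detection to grid coordinates
        # (x grows to the right, y grows forward).
        x = center + int(round(range_m * math.sin(rad) / CELL_M))
        y = center + int(round(range_m * math.cos(rad) / CELL_M))
        if 0 <= x < GRID_SIZE and 0 <= y < GRID_SIZE:
            grid[y][x] = 1  # mark the cell as occupied
    return grid

# Hypothetical detections: an obstacle 2 m straight ahead
# and another 3 m directly to the right.
grid = build_occupancy_grid([(0.0, 2.0), (90.0, 3.0)])
center = GRID_SIZE // 2
print(grid[center + 4][center])   # cell 2 m ahead is occupied
print(grid[center][center + 6])   # cell 3 m to the right is occupied
```

The appeal of this representation is that it is sensor-agnostic: whether the distances come from ultrasonic echoes or from camera-based depth estimates, downstream parking and collision-warning logic can query the same grid.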
The current software temporarily limits some functions that relied on the ultrasonic sensors. During this transition period, Tesla Vision vehicles delivered without USS will have some features limited or inactive: Park Assist, Autopark, Summon and Smart Summon. Once these features reach performance parity with current vehicles, Tesla says they will be restored through a series of over-the-air software updates.
In the meantime, these vehicles keep the following features: forward collision warning, automatic emergency braking, lane departure warning with avoidance, lane departure prevention, pedal misapplication mitigation, automatic high beams, blind spot warning, Autosteer, automatic lane change, Navigate on Autopilot, and Traffic Light and Stop Sign Control.
Source: tesla.com
All images courtesy of Tesla Inc.
Nico Caballero is the VP of Finance of Cogency Power, specializing in solar energy. He also holds a Diploma in Electric Cars from Delft University of Technology in the Netherlands, and enjoys doing research about Tesla and EV batteries. He can be reached at @NicoTorqueNews on Twitter. Nico covers the latest Tesla and electric vehicle news at Torque News.