Teslarati reports that it is receiving letters from Tesla owners who are worried about how the loss of radar in Tesla Autopilot and FSD will affect performance in inclement weather.
The question is this: how will Tesla's pure vision self-driving system work in bad weather?
An email from an Australian reader narrowed in on this point even further. A man named Peter emailed me and said that his Model 3 recently identified a truck ahead of him that was concealed in an opaque white mist several car lengths ahead of his vehicle. “I assumed that visualization was created as a result of radar. In those conditions, the message ‘multiple cameras blocked or obstructed’ appeared and the autopilot screamed and handed over,” Peter said.
Tesla has removed radar from its Autopilot suite and replaced it with a "pure vision" system. This is expected to be the last major update before the wide release of FSD.
In an update to its online configurator five days ago, Tesla announced it is moving from a radar-vision hybrid self-driving system to one that relies solely on cameras.
Previously, the Autopilot section of Tesla’s online configurator stated that the vehicles offer 160 meters of “forward protection” using radar.
Now, however, Tesla has removed any mention of radar, and the Autopilot section only mentions “Powerful visual processing at up to 250 meters of range."
A commenter named Gruff made an interesting point that may shed new light on how you view Tesla's pure vision approach versus the loss of radar.
"Teslas might be fitted with a radar now, however, nobody outside Tesla has a clue how much it is used in the Autopilot code or what priority it is given in the DNNs, whether it is a primary input or treated as a secondary input that is more troublesome and noisy than it is worth. What is somewhat odd is why Tesla would choose to announce this before pure vision is fully established, but then that does seem to be Tesla's way with software. My main concern with the current camera setup is the image quality is nowhere near the human eye, and the ease with which the cameras are obstructed by rain/dirt, etc. This needs addressed."
Armen Hareyan is the founder and Editor-in-Chief of Torque News. He founded TorqueNews.com in 2010, which has since been publishing expert news and analysis about the automotive industry. He can be reached at Torque News Twitter, Facebook, LinkedIn and YouTube.