Elon Musk recently demonstrated Tesla FSD version 12, and the car drove itself using neural nets alone.
How the Car Drives Itself
Elon Musk recently showed off Tesla FSD version 12, Tesla's software that allows the car to drive itself.
The current system relies on a large amount of hand-written code with explicit conditions for stop lights, stop signs, and many other situations. It's a heavily rule-based system in its current version.
However, the version Elon Musk was using replaces that hand-written code with AI: neural nets trained on video of Tesla cars driving. Using this cumulative set of data, the neural nets predict what to do based on how other Tesla drivers handled the same situations.
Across the fleet, there are Tesla vehicles encountering whatever any Tesla needs to handle at any given moment. From those videos, the trained neural nets get put into what is known as an AI brain, which can stop at a stop sign or red light, or do anything else the drive requires.
The neural nets won't necessarily even know what a stop sign is or what brake lights on a car are. The car simply behaves based on what it has seen before in its training data. A human is now out of the equation for writing the rules of the road; the AI learns them from video.
The AI can handle roundabouts, right turn lanes, cyclists, potholes, speed bumps, and many other things, and it's all from data from Tesla cars driving.
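Tesla's actual FSD v12 stack is proprietary, but the idea the article describes, imitating recorded driving rather than following hand-written rules, can be sketched with a toy "behavioral cloning" example. Everything here is hypothetical: the scene features, the recorded actions, and the 1-nearest-neighbor lookup are illustration only, not Tesla's method.

```python
# Conceptual sketch only: generalize from recorded examples of human
# driving instead of hand-written "if stop sign then brake" rules.

# Hypothetical training data: (scene features, recorded driver action).
# Made-up features: [distance_to_stop_line_m, lead_car_brake_lights_on].
training_clips = [
    ([5.0, 0.0], "brake"),      # close to a stop line -> driver braked
    ([50.0, 0.0], "maintain"),  # open road -> driver kept speed
    ([20.0, 1.0], "brake"),     # lead car's brake lights on -> driver braked
    ([60.0, 1.0], "coast"),     # brake lights far ahead -> driver coasted
]

def predict_action(scene):
    """Imitate the action from the most similar recorded scene (1-nearest neighbor)."""
    def squared_distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, action = min(training_clips, key=lambda clip: squared_distance(clip[0], scene))
    return action

# The "car" never sees an explicit rule about stop signs; it just
# imitates what drivers did in the closest matching situation.
print(predict_action([6.0, 0.0]))   # near a stop line -> prints "brake"
print(predict_action([55.0, 0.0]))  # open road -> prints "maintain"
```

A real system would of course use deep networks over raw video rather than a nearest-neighbor lookup over hand-picked features, but the training signal is the same: human driving behavior, at fleet scale.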
When Will Tesla Cars Be Autonomous?
Other companies are building autonomous vehicle software with Lidar, another way to capture data around the car, combined with programming that requires humans to write explicit conditions. This approach isn't very scalable, it is prone to problems, and Lidar is very expensive.
A car with a bunch of sensors around it and on top of it is much bulkier and more expensive, and it depends on pre-mapped areas to understand the world around it. To their credit, these companies do have cars operating without a driver behind the steering wheel, but only in very specific situations under very narrow circumstances.
As long as Tesla continues to grow its fleet into the millions and tens of millions of vehicles, I believe Tesla will solve autonomy, and soon: within the next two years, in my opinion.
Scale is the most important thing when it comes to autonomous vehicles. You must be able to build and produce autonomous vehicles at scale, and Tesla is on that path with continued manufacturing improvements and new Gigafactories.
You also need the software and neural nets to scale, along with scalable video training data, so the cars can handle any situation. Manufacturing and software must both scale, and both should be automated, avoiding manual processes wherever possible.
I believe Tesla vehicles will keep training even after the car is considered autonomous. In a couple of years, expect Tesla to reach at least a Level 4 self-driving system, and a couple of years after that, Level 5, which is completely autonomous. This seems far away today, but it will happen sooner than anyone thinks due to exponential growth.
What do you think? Will Tesla become autonomous with its vehicles in 2025 and level 5 autonomous in 2027?
Leave your comments below, share the article with friends and tweet it out to your followers.
Jeremy Johnson is a Tesla investor and supporter. He first invested in Tesla in 2017 after years of following Elon Musk and admiring his work ethic and intelligence. Since then, he's become a Tesla bull, covering anything about Tesla he can find, while also dabbling in other electric vehicle companies. Jeremy covers Tesla developments at Torque News. You can follow him on Twitter or LinkedIn to stay in touch and follow his Tesla news coverage on Torque News. Image Credit: Tesla Screenshot
Very impressive demonstration of the power of machine learning & neural nets.
When I think of all the FSD lines of code that will be put in the bit bucket when this becomes an FSD beta, I can hear the moans of hundreds of programmers.
I'm still not convinced that only using cameras is the way to go. Cameras, LIDAR, radar, and ultrasonic sensors working in tandem can be combined to provide accurate imaging across various weather and distance situations; e.g., ultrasonic is great for very short distances, and radar's range is not degraded by rain, fog, or snow.