Elon Musk was asked whether people should be able to send bug reports when these kinds of things happen, so that they do not count against them. Other users said that “...collision warning seems to be getting everyone bad ratings on the safety score.” Yet another user said, “...I’m getting randomly dinged for using 4-5 on some rides being super extra careful. Other rides are perfect and I’m doing the same driving.”
Noted. You should be able to press mic button & say “bug report …” — Elon Musk (@elonmusk) September 26, 2021
Generally speaking, there are six levels of self-driving or “automated driving” (Level 0 through Level 5). The classification of the development stages up to the fully self-driving vehicle comes from the Society of Automotive Engineers (SAE) and describes the extent to which the vehicle can and may take over the tasks of the driver. The levels range from Level 0, with no assistance systems at all, to Level 5, which describes fully autonomous driving. The details are as follows, with a short code sketch summarizing the taxonomy after the list:
Level 0: No Automation. The driver is completely responsible for controlling the vehicle, performing tasks like steering, braking, accelerating or slowing down. Level 0 vehicles can have safety features such as backup cameras, blind spot warnings and collision warnings. Even automatic emergency braking, which applies aggressive braking in the event of an imminent collision, is classified as Level 0 because it does not act over a sustained period.
Level 1: Driver Assistance. At this level, the automated systems start to take control of the vehicle in specific situations, but do not fully take over. An example of Level 1 automation is adaptive cruise control, which controls acceleration and braking, typically in highway driving. Depending on the functionality, drivers are able to take their feet off the pedals.
Level 2: Partial Automation. At this level, the vehicle can perform more complex functions that pair steering (lateral control) with acceleration and braking (longitudinal control), thanks to a greater awareness of its surroundings.
Level 2+: Advanced Partial Automation. While Level 2+ is not one of the officially recognized SAE levels, it represents an important category that delivers advanced performance at a price consumers can afford. Level 2+ includes functions where the vehicle systems are essentially driving, but the driver is still required to monitor the vehicle and be ready to step in if needed. (By contrast, Level 3 represents a significant technology leap, as it is the first level at which drivers can disengage from the act of driving — often referred to as “mind off.” At Level 3, the vehicle must be able to safely stop in the event of a failure, requiring much more advanced software and hardware.) Examples of Level 2+ include highway assistance or traffic jam assistance. The ability for drivers to take their hands off the wheel and glance away from the road ahead for a few moments makes for a much more relaxing and enjoyable experience, so there is strong consumer interest.
Level 3: Conditional Automation. At Level 3, drivers can disengage from the act of driving, but only in specific situations. Conditions could be limited to certain vehicle speeds, road types and weather conditions. But because drivers can apply their focus to some other task — such as looking at a phone or newspaper — this is generally considered the initial entry point into autonomous driving. Nevertheless, the driver is expected to take over when the system requests it. For example, features such as traffic jam pilot mean that drivers can sit back and relax while the system handles it all — acceleration, steering and braking. In stop-and-go traffic, the vehicle sends an alert to the driver to regain control when the vehicle gets through the traffic jam and vehicle speed increases. The vehicle must also monitor the driver’s state to ensure that the driver resumes control, and be able to come to a safe stop if the driver does not.
Level 4: High Automation. At this level, the vehicle’s autonomous driving system is fully capable of monitoring the driving environment and handling all driving functions for routine routes and conditions defined within its operational design domain (ODD). The vehicle may alert the driver that it is reaching its operational limits if there is, say, an environmental condition that requires a human in control, such as heavy snow. If the driver does not respond, it will secure the vehicle automatically.

Level 5: Full Automation. Level 5-capable vehicles are fully autonomous. No driver is required behind the wheel at all. In fact, Level 5 vehicles might not even have a steering wheel or gas/brake pedals. Level 5 vehicles could have “smart cabins” so that passengers can issue voice commands to choose a destination or set cabin conditions such as temperature or choice of media.
With Level 5 we have arrived at actual autonomous driving: unlike the previous levels, neither driving ability nor a driving license are required to use the vehicle. The driver becomes a pure passenger.
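For readers who prefer code, here is a minimal sketch of the taxonomy above as a Python lookup. The enum names and the helper function are illustrative assumptions of mine, not part of the SAE J3016 standard or any Tesla software:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving automation levels, as summarized above."""
    NO_AUTOMATION = 0           # driver does everything; warnings and emergency braking only
    DRIVER_ASSISTANCE = 1       # steering OR speed assistance (e.g. adaptive cruise control)
    PARTIAL_AUTOMATION = 2      # steering AND speed assistance; driver must supervise
    CONDITIONAL_AUTOMATION = 3  # system drives in limited conditions; driver takes over on request
    HIGH_AUTOMATION = 4         # system handles everything within its operational design domain (ODD)
    FULL_AUTOMATION = 5         # fully autonomous; no driver required at all

def driver_must_supervise(level: SAELevel) -> bool:
    """Up to and including Level 2 (and the unofficial 'Level 2+'), the human
    driver remains responsible for monitoring the road at all times."""
    return level <= SAELevel.PARTIAL_AUTOMATION

print(driver_must_supervise(SAELevel.PARTIAL_AUTOMATION))      # True
print(driver_must_supervise(SAELevel.CONDITIONAL_AUTOMATION))  # False
```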
According to the Washington Post, “...This weekend’s release would make it available to those who have purchased the now-$10,000 software upgrade, and those who have purchased a subscription from Tesla for about $100 to $200 per month — if they can first pass Tesla’s safety monitoring”.
A beta tester known on YouTube as HyperChange tested the update. While he said his experience was overall positive, in a video designed to test FSD's ability to navigate around Seattle's Monorail, the Tesla at one point nearly turned right into a group of pedestrians crossing the street. Another driver had a similar experience when he tested FSD near the San Jose Light Rail. In AI Addict's video, the Tesla attempted to turn right and skidded completely over a curb. The driver pointed out that the software made the right turn without taking the light rail into consideration: the rail took up the right side of the street and forced the car further left in its turn. Despite the bugs, many Tesla beta testers report that the system is getting better, especially when it comes to decision making. The update also drew compliments for the updated self-driving graphics.
Now that we have a better understanding of the different levels of “automated driving,” or FSD as Tesla calls it, and going back to the bug reporting issue, the procedure is simply to “...tap the microphone icon and say ‘bug report’ (and then the bug you found). Tesla will get a copy of your report.” It is THAT simple.
Nico Caballero is the VP of Finance of Cogency Power, specializing in solar energy. He also holds a Diploma in Electric Cars from Delft University of Technology in the Netherlands, and enjoys doing research about Tesla and EV batteries. He can be reached at @NicoTorqueNews on Twitter. Nico covers the latest Tesla and electric vehicle news at Torque News.
Comments
FYI, this bug report feature is not new (owners just never read their manuals)... My 2017 owner's manual, dated 9/5/2017, has the following:
Note: You can also use voice commands to provide feedback to Tesla. Say "Note", "Report", "Bug note", or "Bug report" followed by your brief comments. Model X takes a snapshot of its systems, including screen captures of the touchscreen and instrument panel. Tesla periodically reviews these notes and uses them to continue improving Model X.
In reply to Daryl Underwood:
Well noted, thank you very much for the info.
I just got enhanced FSD a few days ago. I was very impressed, although I found the steering somewhat jerky in some acute turns. I used to be a driving instructor, and found FSD very similar to driving with a student driver. It seems that the cameras are not looking far enough ahead. They should be evaluating the width of the road and the number of lanes possible in that space. At the same time, the system should be visualizing where the car is with respect to where it will be very soon. It should not jerk the steering wheel; unless it is very close to something, a sudden correction means it is doing something very wrong. Sudden movements surprise other drivers and make the ride uncomfortable and often dangerous. So the AI should be able to check each extremity of the lane, determine where the center of the lane is, and make sure the car is heading there, many times per minute. If lines are erased or absent, it should be able to determine where the lines should be. If lines reappear, it should learn to evaluate them better next time. It should also use the back camera in the same way a driver uses his or her mirrors; that view should tell it when to accelerate more rapidly or brake more smoothly. The more it does everything smoothly, the less it surprises you and other drivers. Looking far ahead is vital for easy driving for a human student. It should and will be the same for an AI driver.
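For what it's worth, the lane-centering behavior described in the comment above can be sketched very roughly in code. This is purely an illustrative toy under assumed inputs (hypothetical lane-edge distances reported by the cameras), not Tesla's actual control logic:

```python
def lane_center_offset(left_edge_m: float, right_edge_m: float) -> float:
    """Given the lateral positions of the left and right lane edges relative to the
    car (meters; left negative, right positive), return how far the car sits from
    the lane center. Positive means the car is to the right of center."""
    lane_center = (left_edge_m + right_edge_m) / 2.0  # where the lane center appears relative to the car
    return -lane_center

def smooth_steering_correction(offset_m: float, previous_cmd: float,
                               gain: float = 0.3, max_step: float = 0.05) -> float:
    """Steer gently back toward the lane center: a proportional correction that is
    rate-limited so the wheel never jerks (the commenter's main complaint)."""
    target = -gain * offset_m  # steer left if the car is right of center, and vice versa
    step = max(-max_step, min(max_step, target - previous_cmd))
    return previous_cmd + step

# Example: lane edges detected 1.6 m to the left and 2.0 m to the right, so the car
# is 0.2 m left of center and gets a small, smooth correction toward the right.
cmd = smooth_steering_correction(lane_center_offset(-1.6, 2.0), previous_cmd=0.0)
print(round(cmd, 3))  # 0.05 (limited by max_step)
```

Running such a check many times per second, and blending in lane geometry predicted further ahead, is essentially what the commenter is asking the cameras and planner to do.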
Regrettably "Bug Report"
Permalink
Regrettably "Bug Report" records too small of a time window to adequately describe a problem. For example, one could find that when you say "Set max battery charge level to 75%" the Tesla responds by setting the drivers side thermostat to high. Getting those two sentences in in that time is pretty difficult. They really need a bugzilla portal or similar.
As a new Tesla owner of a Model Y, I would like to draw your attention to what I feel is an important, accident-causing issue: the rear camera's serious distortion when it is covered by rain or water. On two recent occasions, when attempting to back into a standard parking spot (one time at a Tesla charging station), the rain had made the rear camera view so distorted that it was impossible to follow the guidelines accurately. This happened again yesterday when backing into a parking spot while it was raining hard.
I asked a friend who also owns a new Model Y, and he said he had also experienced this distortion.
I am requesting a prompt response to this important concern so that it may be corrected. I would think that this issue has already been brought to your attention.