Tesla Autopilot Updates: Are They Too Dangerous?

by archyw

Tesla is again testing a beta version of its Autopilot. But the first videos show numerous near-accidents. How safe are the Autopilot updates? And should the software updates be more strictly regulated? A new US regulation appears to target Tesla.

Tesla’s Autopilot has been a topic of discussion for years, because time and again Tesla drivers are involved in fatal accidents while the driver assistance system is activated.

Now Tesla is testing a new Autopilot update with selected beta users. However, the first videos show worrying near-accidents that make the assistance system look anything but safe.

Autopilot Updates: Worrying Videos

Electric car enthusiast Taylor Ogan shared a video of such a test drive with the Autopilot update FSD Beta 9 on Twitter. In the short excerpt, the driver, with Autopilot activated, repeatedly lets go of the steering wheel and nearly causes an accident.

In the full video, the car nearly drives into other vehicles several times. Autopilot also seems to have serious problems recognizing lanes correctly, so the test driver has to take back the steering wheel over and over again.

Accordingly, Ogan and many other users criticize Tesla for calling this technology “Full Self-Driving”. The problem is not so much the technology behind Autopilot as the name.

Tesla’s Autopilot is not fully self-driving

While many experts consider Tesla’s assistance system to be one of the best in the industry, terms such as “Autopilot” or “Full Self-Driving” mode are misleading.

They suggest that the vehicle is (fully) autonomous, i.e. at least autonomy level three according to the classification of the Society of Automotive Engineers (SAE).

Correspondingly, drivers with Autopilot activated often take their eyes off the road and their hands off the steering wheel, assuming that the Tesla can drive by itself.

Tesla’s Autopilot is an assistance system

But it cannot. Strictly speaking, Tesla’s Autopilot is an assistance system, albeit a very well-developed one. It corresponds to SAE level two, at which the vehicle can temporarily take over certain functions but the driver’s attention is required at all times.

Tesla keeps pointing this out to drivers. Nevertheless, serious and fatal accidents occur time and again because drivers place too much trust in the system.

It is even more problematic when, as in the video, a beta version of an Autopilot update that is not yet fully developed is on the road.

Regulating Autopilot updates

Accordingly, consumer protection organizations and motorists repeatedly demand stricter regulation, especially of the test versions of the Autopilot updates.

Indeed, it is questionable how the beta testers, who have no special qualifications for such test drives, use the technology in road traffic. In doing so, they may endanger other people who are not even aware that someone is driving in Autopilot mode.

In Germany, for example, this would not be permitted under the road traffic regulations. But more restrictions could soon come in the USA as well.

New rule mandates accident reporting

The US National Highway Traffic Safety Administration (NHTSA) has now introduced a new rule that, as some suspect, targets Tesla’s Autopilot.

It could also be an attempt by the NHTSA to make amends: in the past, the agency gave Tesla’s Autopilot very good marks in a study that, on closer inspection, turned out to be incomplete.

Under the new regulation, automakers must report accidents involving assistance systems within one day. This applies to systems from autonomy level two to five, which includes Tesla’s Autopilot.

So far, such reports have been voluntary, at least at the federal level, in order to protect trade secrets and innovative technology developments. In some US states, such as California, where many automakers test their autonomous technologies because of the favorable weather conditions, the rules are already stricter.

Tesla is silent

Almost all automakers in the US have welcomed the new rule, with the exception of Tesla, which has so far remained silent on the subject. The manufacturers hope the rule will lead to less misuse and to better acceptance of the technology, whose reputation has suffered from the Tesla accidents.

It is therefore quite possible that Tesla will improve its updates in order to avoid accidents in the future. For example, there are indications that Tesla wants to introduce a camera that monitors driver behavior in Autopilot mode.

As far as is known, since the introduction of Tesla’s Autopilot in 2015 there have been 19 deaths worldwide in accidents in which the technology was activated.
