Tesla should insist its cars are NOT autonomous: another driver has died

History has repeated itself. A recent fatal traffic accident in the United States, in which the driver of a Tesla was killed, may be linked to improper use of the Autopilot system. The victim was known on social media for his praise of that system.

On May 5 there was a traffic accident in Fontana, a town 80 kilometers east of Los Angeles (California), specifically on Interstate 210. A Tesla Model 3 hit a semi-trailer truck that was lying across the highway. The Tesla driver died, and two other men near the truck were seriously injured and required hospitalization.

The National Highway Traffic Safety Administration (NHTSA) is investigating what happened. At the moment it is not clear whether the car's Autopilot system was engaged. The California Highway Patrol said preliminarily that Autopilot was activated, but the next day it walked that back and said it was not so clear. Initially, it looked like it was being used.

The victim leaves behind a widow and two orphaned children

To date, three Tesla drivers have died in the United States due to misuse of Autopilot: the system requires the driver to remain fully attentive and to be ready to take the wheel at any moment. But some drivers still do not seem to grasp this.

The white Model 3 belonged to 35-year-old Steven Michael Hendrickson. He was an active member of the Tesla Club of Southern California, and on his social media accounts, especially TikTok, it is not hard to find videos in which he shows great faith in the capabilities of the Autopilot system.

The NHTSA has already investigated 29 road accidents of this type in the US

He posted comments like "the best co-driver you can have, it even drives through boring traffic for me", "What would become of me without my autonomous Tesla after a long day at work", and "Coming home from Los Angeles after work, thank you God, thank you, Autopilot".

Tesla's FSD ("Full Self-Driving") system is in a closed testing phase with a limited number of the brand's customers. They must commit to staying vigilant behind the wheel at all times, or they may be removed from the closed beta. We do not know whether Hendrickson was part of that beta or was simply over-relying on Autopilot.

Steven Hendrickson's TikTok post from a year ago

Tesla continues to have a problem with its own customers: they are not convinced that Autopilot is a semi-autonomous system that requires keeping your eyes on the road, no matter how autonomous it may appear. Among its known weak points is failing to detect large vehicles lying across its path. A case in point: a similar accident in Taiwan.

The company led by Elon Musk has had to add limitations to force drivers to keep their attention on the wheel and not let Autopilot do everything. However, these can be circumvented with a roll of duct tape and some string to simulate the weight of a hand on the wheel, even if the driver is not physically in the seat.

On social networks, in digital news archives, and even on YouTube we have seen all kinds of recklessness: drivers taking a nap in a traffic jam, drivers who have switched seats, or drivers watching a movie instead of looking at the road. They defy the laws of natural selection.

Consumer Reports demonstrated, on a closed circuit, that it is possible to "trick" the Autopilot system so that the car apparently drives itself without detecting that there is no one in the driver's seat

Only when there is a binding guarantee that an autonomous system is completely autonomous, and that it can operate without a driver in any kind of situation, will the behavior described above stop being reckless. Right now, unambiguously, it is. And a few have already paid for it with their lives.

Teslas use various sensors to understand the world around them and make decisions in milliseconds: essentially cameras, radar, and short-range ultrasonic sensors. Although it is a very sophisticated system, and one of the best in the industry, it is still not fully autonomous, and treating it as such can be dangerous.

These deaths from misuse are the exception that proves the rule: a driver who uses Autopilot but stays attentive to the road is less likely to be involved in an accident. The statistics are very clear on this.
