A jury in Florida has found Tesla partially responsible for a 2019 crash involving the company’s Autopilot driver-assistance feature, The Washington Post reports. As a result, the company will have to pay around $200 million in damages.
Autopilot comes standard on Tesla’s cars and handles features like collision warnings and emergency braking. Tesla has avoided liability in most crashes involving cars with Autopilot active, but the Florida case went differently. The jury decided that the self-driving tech enabled driver George McGee to take his eyes off the road and strike a couple, Naibel Benavides Leon and Dillon Angulo, killing one and seriously injuring the other.
During the case, Tesla’s lawyers argued that McGee’s decision to take his eyes off the road to reach for his phone was the cause of the crash, and that Autopilot should not be blamed. Lawyers for the plaintiffs, Angulo and Benavides Leon’s family, argued that the way Tesla and Elon Musk talked about the feature created the illusion that Autopilot was safer than it really is. “My assumption was that it would help me if I had a failure … or should I make a mistake,” McGee said on the stand. “And in this case, I feel like it failed me.” The jury ultimately assigned two-thirds of the responsibility to McGee and one-third to Tesla, according to NBC News.
In a 2024 Autopilot investigation, the National Highway Traffic Safety Administration blamed drivers’ misuse of Tesla’s system for crashes, not the system itself. But the NHTSA also found that Autopilot was overly permissive and did not adequately ensure that drivers kept their attention on the driving task, which was a factor in the 2019 Florida crash.
Although Autopilot is only one component of Tesla’s self-driving features, selling the idea that the company’s cars can safely drive themselves is an important part of its future. Elon Musk claims that Full Self-Driving (FSD), a paid upgrade over Autopilot, “is safer than human driving.” Tesla’s robotaxi service relies on FSD being able to operate with little or no supervision, and it produced mixed results in its first few days.


