In an Odd Attempt at Defence, Tesla Claims Elon Musk's Statements on Self-Driving Cars “Might Have Been Deep Fakes”

In a lawsuit filed by the family of a Tesla owner who died in a crash while using Autopilot several years ago, Tesla has chosen to employ an odd defence.

The manufacturer asserted that Elon Musk, the company's CEO, shouldn't have to answer questions about some of his public comments on self-driving because they could have been “deep fakes.”

The case centres on the death of Apple engineer Walter Huang in his Tesla Model X while commuting to work in 2018.

As previously reported, the Model X was running on Autopilot when it treated the median of a highway ramp as if it were a lane and collided with a barrier around 150 metres later.

Because the crash attenuator had been destroyed in a prior crash, the resulting damage was severe. The driver was taken to hospital in a critical condition and later died.

The NHTSA reviewed the crash and determined that the car was operating on Autopilot when it happened, but it placed the blame on the driver, who, according to phone data, had been playing a game on his phone at the time of the collision, and on the absence of a crash attenuator.

Tesla advises users to always pay attention and be prepared to take the wheel when using Autopilot.

The Huang family nonetheless decided to file a lawsuit, arguing that statements made by Tesla, and more specifically by CEO Elon Musk, about Autopilot and self-driving technology gave Huang the impression that he could use Autopilot in the way that led to the accident.

The lawsuit is scheduled to go to trial in Santa Clara County Superior Court later this year, but Tesla has mounted an odd defence to try to exclude Musk and his words from the case.

The automaker argues that since some of the statements Musk is said to have made may have been “deep fakes,” he shouldn't be questioned about them.

Deep fakes primarily refer to synthetic media that has been digitally altered to swap one person's likeness for another's, but the term is also used for CGI videos created to make someone appear to say something they never actually said.

Judge Evette D. Pennypacker rejected the defence. In her ruling, she stated (according to The Telegraph):

They contend that Mr. Musk’s public utterances are immune since he is well-known and may be a greater target for deep fakes.

In other words, Mr. Musk and anyone in his position are free to publicly say whatever they want and then use the possibility that their recorded utterances are a complete fabrication to avoid accepting responsibility for what they actually said and did.

Under her ruling, Musk must be made available for a three-hour deposition to address his comments regarding Tesla's Autopilot and Full Self-Driving.

E Auto Arena’s Opinion

If Tesla believes that some of the statements are deep fakes, it should specify which ones and make an effort to substantiate that claim. The mere fact that deep fakes can be created hardly makes anyone immune from having their statements verified. Bizarre!


