Lawsuits test Tesla claim that drivers are solely responsible for crashes
Without paywall: https://archive.ph/NGkbf
Autopilot “is not a self-driving technology and does not replace the driver,” Tesla said in response to a 2020 case filed in Florida. “The driver can and must still brake, accelerate and steer just as if the system is not engaged.”
Tesla's terminology is so confusing. If "Autopilot" isn't self-driving technology, does that mean it's different from "Full Self Driving"? And if so, is "Full Self Driving" also not a self-driving technology?
I heard Elon Musk call it: "Assisted full self driving". Which doesn't make any sense. LOL
"It's called whatever will make the stock price go up."
The self in this equation is you. You're driving your self around. Full self driving 😉
The term autopilot comes from aviation, where the only kind of problem resolution an autopilot does is turning itself off.
Other than that, it just flies from waypoint to waypoint.
If only we could implement testing protocols similar to the aviation version to validate its safety!
Depends on the autopilot. There are some that are as rudimentary as a "wing leveler." They only have control of the ailerons and can level the wings and maybe make turns. Other systems have control of all three major control axes and are integrated with the navigation systems so they can do things like climb to an altitude and level off, turn to a heading, or even fly holds and approaches.
They do require training on the part of the pilot to use in flight.
It's marketing
Autopilot is a more basic driver assist system than FSD. FSD is what will eventually become what the name suggests but it's obviously not there yet and everyone knows this. It's just the name of the system.
Those are really crappy names. How about "driver assist" and "supervised self driving"? Drop the "supervised" once they're ready to market it as real self driving.
FSD is just a lie, because it's a description of a product they intend to develop, not something that exists on the car you are buying now.
You can't call something Full Self Driving or Autopilot and then blame the driver. If you want to blame the driver, then call it driver assist.
Right! That's why you have the FSD turn it over to the driver the moment a crash is unavoidable to make the driver liable.
"at the time of the crash, the driver was in full control"
(but not a couple seconds before)
I think Tesla should rename Auto Pilot to Darwin Award Mode.
And improve motorcycle detection as well as use LIDAR.
It's not that Teslas are killing their owners. Teslas are killing first responders to road accidents, kids getting off buses and motorcyclists. We're all exposed to the problems caused by Musk cutting out testing to save some money.
The customers pay extra in order to be beta testers. Best deal ever!
That’s just the price we have to pay for this wonderful capitalist system. Worth it!
I like calling it cruise control with extra fatalities.
Heck, even using the same sonar/radar/whatever normal cars use, instead of just cameras, would be a huge improvement.
You're also responsible for what you do when you're drunk! Guess what. You cannot purchase ethical excuses. That's YOUR Tesla. You own it. You're in charge of it regardless of whether or not Tesla makes it impossible to access the controls.
Buyer beware. Stop buying proprietary garbage, ya idiots.
Unfortunately there is no car that isn't proprietary, and even ones without "auto pilot" have things like collision detection that can slam on the brakes for you.
There are high quality reliable cars that still run great from the early to mid 2000s. They are very inexpensive compared to modern vehicles. May cost a bit more in gas.
You control them completely
Didn't he preach in the past that NOT using automated driving systems would be completely unsafe?
Confidence Man!
You're clearly not looking hard enough. Maybe you just prefer a dangerous road where nobody takes responsibility?
Good luck buying any car then. Tesla is the worst of the worst in that regard, but they’re all bad these days.
No they're not.
This is the best summary I could come up with:
SAN FRANCISCO — As CEO Elon Musk stakes the future of Tesla on autonomous driving, lawyers from California to Florida are picking apart the company’s most common driver assistance technology in painstaking detail, arguing that Autopilot is not safe for widespread use by the public.
Evidence emerging in the cases — including dash-cam video obtained by The Washington Post — offers sometimes-shocking details: In Phoenix, a woman allegedly relying on Autopilot plows into a disabled car and is then struck and killed by another vehicle after exiting her Tesla.
Late Thursday, the National Highway Traffic Safety Administration launched a new review of Autopilot, signaling concern that a December recall failed to significantly curb misuse of the technology and that drivers are misled into thinking the “automation has greater capabilities than it does.”
The company’s decision to settle with Huang’s family — along with a ruling from a Florida judge concluding that Tesla had “knowledge” that its technology was “flawed” under certain conditions — is giving fresh momentum to cases once seen as long shots, legal experts said.
In Riverside, Calif., last year, a jury heard the case of Micah Lee, 37, who was allegedly using Autopilot when his Tesla Model 3 suddenly veered off the highway at 65 mph, crashed into a palm tree and burst into flames.
Last year, Florida Circuit Judge Reid Scott upheld a plaintiff’s request to seek punitive damages in a case concerning a fatal crash in Delray Beach, Fla., in 2019, when Jeremy Banner’s Tesla, operating on Autopilot, failed to register a semi truck crossing its path.
The original article contains 1,850 words, the summary contains 263 words. Saved 86%. I'm a bot and I'm open source!
Even when the driver is fully responsible, the assistance software must work properly in all situations, and it must be tested thoroughly.
If the software makes a severe mistake without warning, normal drivers may not have a chance to regain control. Normal drivers are not trained test drivers.
The article keeps calling it “Autopilot”, which is different from “Full Self Driving”.
If they are correct, then it’s all on the driver. Autopilot is just a nicer adaptive cruise control, and should be treated as such. Many cars have it, even non-smart vehicles. Even my seven-year-old Subaru had something similar (much dumber, but similar).
That being said, people seem to confuse the names of these different functionalities all the time, including throughout this thread. However, even if they were confused and meant FSD, my car has feedback requiring your hands on the wheel, so I don’t understand how you can claim ignorance.
My morality says both are accountable: the driver, and Tesla. Tesla for damage caused by their system, and the driver for not retaking control of the vehicle when given the chance.
I would say depends. If the user was using the feature correctly then Tesla should have some liability.
Most of the crashes I’ve seen the people were not using the feature correctly.
They might be using it how Tesla markets it.
The vehicle prompts you to keep your eyes on the road and be prepared to take over at any moment every single time you enable this feature. To pretend that Tesla drivers don't know this "because of false advertising" is just as false as the advertising itself.
No, the majority of crashes I’ve seen. Nowhere does the marketing say to read your email and take your hands off the wheel.