Tesla FSD hits kids

Dan O'Dowd (Senatorial candidate) claims FSD is unsafe, and uses a deceptive video to do it. He shows Tesla FSD hitting a kid-size mannequin. There are problems with both the video and the premise: it appears FSD isn't engaged, the person making the accusation created a scenario where it would fail, and he likely has his foot on the gas, overriding the car.
ℹ️ Info          
~ Aristotle Sabouni
Created: 2022-08-13 
Dr. Know-it-all Knows it all (YouTube): Demonstrating why he believes the video was faked (or misleading).
🗒️ Note:
There are multiple versions of the video that show slightly different things. One was older (it used FSD, but the driver appears to be overriding the car). Another, from the ad, was slightly different (it was using Autopilot). But they aren't offering the steps to reproduce the problem. The goal isn't fixing the software; it's maligning the company.

There's a California Senatorial candidate (Dan O'Dowd) running a campaign to spread a lie/exaggeration that Tesla FSD (Full Self Driving) doesn't detect kids on the road, based on a staged test with a mannequin. Facts to know:

  • The test wasn't using Tesla FSD; it was using a lesser technology, their advanced Autopilot (more of a fancy cruise control, meant for highways, not private roads).
  • The test was designed for failure, with cones preventing the car from swerving to go around the mannequin.
  • The test was not objective: it was not done with anyone who could challenge it, it has not been repeated, and there's reason for suspicion.
  • Tesla allows manual override. If your foot is on the gas, then your foot overrides the Autopilot, and that warning (that the driver is overriding the Autopilot) appears to be visible on the screen.
🗒️ Note:
Driver override is a useful feature. When ours sees a flashing yellow light, it thinks it's a traffic light turning yellow (and soon red), so it starts slowing down/stopping, and I tap the gas to override. European companies trust tech over people (the tech wins arguments); American companies generally let the people win. Boeing and Airbus have the same split... and a simple pitot tube failure (multiple tubes freezing) meant an Airbus flew a plane into the ocean while the pilots were unable to stop it. The autopilot won, and only the crew and passengers died...
  • Multiple FSD owners are trying to duplicate the result (and can't), and have not been seeing anything like it in the real world. (I drive with the lesser Autopilot, as I am not in the FSD beta, and I've had it stop/slow for pedestrians.)
    • Some are claiming that Tesla's AI can somehow tell the difference between a child and a mannequin/cardboard. Maybe. But by Occam's razor, the simpler answer is more likely: the driver is overriding the braking.
  • There is not a single case of any kid being harmed. If this were a real problem, with billions of miles covered on FSD, why is it just coming out now, with no real-world examples of lawsuits? Smells fishy.
  • Remember, there are warnings in the software that you are responsible for the car. It's an augmentation system so far, still in beta, and it still does a few weird things. But the accident rates with it on are far lower than with it off. Even if they did find a failure (we don't think they did), that doesn't change the many cases where it helps/saves lives.
    • The detractors' argument is like pointing out the truth that airbags sometimes kill/hurt people. (They do.) Most of the time they help; in a few cases they break necks or propel things into/through people. You're more likely to be saved than killed by one. Should the fact that they might hurt some people (or not save every life) prevent them from existing? If so, you still wouldn't have airbags, and more people would die while the technology was being perfected -- or it would never be implemented, because perfect is the enemy of good, or good enough, or better than nothing.
    • Tesla aggressively estimates that they save 40 accidents per day just due to one common operator mistake (missing the brake and hitting the gas pedal instead -- SUA, sudden unintended acceleration). [1]
  • There are questions about whether the creator has investments in competing technologies (like LIDAR). There are claims, but I haven't yet seen them verified.
  • There have been 30 crashes involving Teslas and Autopilot (not FSD) since 2016, and they had crossed a few billion Autopilot miles in 2019, with 2.6M Teslas on the road globally. For context: with Autopilot on, Tesla averages about 4M miles per accident; with it off, about 1M miles per accident. The statistics aren't screaming that there's a huge risk here (that it's encouraging unsafe behavior, or that a lot of kids have died because of this). Thus this is more likely politics than a real problem.
    • NOTE: I'm not saying Tesla's data is the whole story. There can be some selection bias in what they're showing. For example, people activate FSD in places like freeways, or once they're on a major road, and not in parking lots -- and most accidents are in parking lots. But the point stands: the data isn't throwing big red flags about it being unsafe, which it likely would if it were a killer. (See the back-of-the-envelope sketch below.)
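For context on those rates, here is a minimal back-of-the-envelope sketch (Python) using the rough figures cited above: about 4M miles per accident with Autopilot on vs. about 1M miles per accident with it off. These are the article's approximations, not official data, and the variable names are just for illustration.

  # Rough comparison of the accident rates cited above (article's approximations, not official data).
  miles_per_accident_on = 4_000_000    # ~4M miles per accident with Autopilot on
  miles_per_accident_off = 1_000_000   # ~1M miles per accident with Autopilot off

  # Accidents per million miles driven (lower is better).
  rate_on = 1_000_000 / miles_per_accident_on
  rate_off = 1_000_000 / miles_per_accident_off

  print(f"Accidents per 1M miles, Autopilot on:  {rate_on:.2f}")              # 0.25
  print(f"Accidents per 1M miles, Autopilot off: {rate_off:.2f}")             # 1.00
  print(f"Accident rate ratio (off vs. on):      {rate_off / rate_on:.1f}x")  # 4.0x

Even granting the selection-bias caveat above, that ratio is the opposite of what you'd expect if the system were routinely running people over.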

The Dawn Project

Debunking the Dawn Project, Dan O'Dowd's company, which exists to seed doubt about Tesla's FSD and other automation systems that don't use Dan O'Dowd as their consultant. To sum up, the basic message seems to be, "Everything that doesn't use me sucks."

Transport Evolved (YouTube): Let's Shine Some Light On The Dawn Project
Whole Mars Catalog (YouTube): Does Tesla Full Self-Driving Beta Really Run Over Kids? (YouTube pulled the video, but it is still available on Twitter.)
TechGeek Tesla (YouTube): Is Tesla's FSD Beta Software SAFE? This Video Settles it Once and for All - v10.12.2




🔗 More

Tesla
Tesla is one of the most fascinating car companies and success stories in history.


🔗 Links

Tags: Tesla
