A Tesla was in its self-driving mode when it crashed into a patrol vehicle parked at the scene of a fatal crash in Orange County Thursday morning, police said.

The officer, whose vehicle was struck, was on traffic control duty blocking Orangethorpe Avenue in Fullerton for an investigation into a suspected DUI crash that had left a motorcyclist dead around 9 p.m. Wednesday.

A Fullerton Police Department spokesperson said the officer was standing outside his vehicle around midnight when he saw a Tesla driving in his direction and not slowing down.

  • Flying Squid@lemmy.world

    It really doesn’t help that the media isn’t putting “Self-Driving” Mode in quotes since it isn’t fucking self-driving.

    • FireRetardant@lemmy.world

      The victims involved in crashes aren’t always rich. People in other cars or pedestrians and cyclists can be injured by these mistakes.

    • garretble@lemmy.world

      If only it were that simple. WE are all the test subjects in this case whether we like it or not.

  • BigMacHole@lemm.ee

    That must have been SO scary for the cop! He wouldn’t know whether to shoot the car or the passenger!

  • Wrench@lemmy.world

    Fuck Elon, and to a lesser extent, Tesla and all. But this seems like yet another user error on several counts. I thought “autopilot” was only supposed to be used on freeways. And it’s obviously supposed to be supervised by a human, who should have seen a fucking parked cop car ahead and intervened anyway.

    But that said, fuck Elon and his deceptive naming of a fucking primitive tech that’s really only good at staying in a lane at speed under ideal conditions.

    • halcyoncmdr@lemmy.world

      I thought “autopilot” was only supposed to be used on freeways. And it’s obviously supposed to be supervised by a human, who should have seen a fucking parked cop car ahead and intervened anyway.

      It depends on which system they actually had on the vehicle. It’s more complicated than random people seem to think. But even with the FSD beta, it specifically tells the driver every time they activate it that they need to pay attention and are still responsible for the vehicle.

      Despite what the average internet user seems to think, not all Teslas even have the computer capable of Full Self Driving installed; I’d even say most don’t. Most people seem to think that Autopilot and FSD are the same. They’re not, and never have been.

      There have been 4+ computer versions in use over the years as they’ve upgraded the hardware and added capabilities in newer software. Autopilot, Enhanced Autopilot, and Full Self Driving BETA are three different systems with different capabilities. Anything bought prior to the very first small closed public beta of FSD a couple of years ago would need its computer replaced to use FSD. The installation cost is included if someone buys FSD outright; otherwise they have to pay for the upgrade if they want the subscription. All older Teslas, however, are limited to Autopilot and Enhanced Autopilot without that computer upgrade.

      The AP and FSD systems are not at all the same, and they use different code. Autopilot is designed and intended for highways and doesn’t require the upgraded computer; it is and always has been effectively just Traffic Aware Cruise Control and Autosteer. Enhanced Autopilot added extra features like Summon, Auto Lane Change, Navigate on Autopilot (on-ramp to off-ramp navigation), etc., but has never been intended for city streets. Autopilot itself hasn’t really been updated in years; almost all the updates have gone to the FSD beta.

      The FSD beta is what is being designed for city streets, intersections, etc. and needs that upgraded computer to process everything for that in real time. It uses a different codebase to process data.
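
      As a rough sketch of the split described above (a toy model only, not Tesla’s actual software; the feature lists and the “minimum computer” numbers are just illustrative):

      ```python
      # Toy model of the three packages described above and the hardware gate for FSD.
      # Names, feature lists, and revision numbers are illustrative, not Tesla internals.
      from dataclasses import dataclass


      @dataclass(frozen=True)
      class AssistPackage:
          name: str
          features: tuple[str, ...]
          city_streets: bool     # is the package intended for city streets?
          min_computer: float    # minimum onboard-computer revision it needs


      PACKAGES = {
          "AP": AssistPackage("Autopilot",
                              ("Traffic Aware Cruise Control", "Autosteer"),
                              city_streets=False, min_computer=2.5),
          "EAP": AssistPackage("Enhanced Autopilot",
                               ("Summon", "Auto Lane Change", "Navigate on Autopilot"),
                               city_streets=False, min_computer=2.5),
          "FSD": AssistPackage("Full Self Driving (beta)",
                               ("city streets", "intersections"),
                               city_streets=True, min_computer=3.0),
      }


      def can_run(package: str, installed_computer: float) -> bool:
          """True if the car's computer meets the package's minimum revision."""
          return installed_computer >= PACKAGES[package].min_computer


      # A 2018 car on Computer 2.5 runs AP/EAP but needs the upgrade for FSD.
      print(can_run("AP", 2.5), can_run("FSD", 2.5), can_run("FSD", 3.0))
      # True False True
      ```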

      • NotMyOldRedditName@lemmy.world

        All cars since mid 2019 have the computer required for FSD.

        At this point that includes the majority of all Teslas ever sold. Somewhere between 750k and 800k out of roughly 6 million (about 12-13%) don’t have the hardware, and of those, 100-200k are upgradeable; maybe more, but it isn’t worth the research time to pin down.

        That being said, it still could have been AP and not FSD as the media gets it confused all the time.

        • halcyoncmdr@lemmy.world

          Is that the actual cutoff date? My 2018 Model 3 that came with Enhanced Autopilot was originally said to have the hardware necessary for FSD (the car reports Computer 2.5), but there were hardware updates before FSD actually became available.

          I never considered buying it, so I never paid more than cursory attention to all of the different hardware revisions, only to major ones like Computer 3, the removal of radar during the parts shortages around COVID, the ultrasonic sensors, etc.

          Also, I hadn’t realized it had actually been that long since I bought it; without most of the regular time-based car maintenance like oil changes, time has flown by with it. Nor had I realized that production had ramped up so significantly since I got my Model 3. I knew it had ramped, obviously, and that the Model Y had launched, but I didn’t realize how significant all of that actually was when added together.

          • NotMyOldRedditName@lemmy.world

            Ya, it’s been that long and they’ve made a lot of cars since heh.

            I got mine in H1 2019 and it was HW 2.5, and sometime shortly after that HW3 came out. At the time I knew that was the situation but I wasn’t concerned since they said they’d upgrade it.

            It took a while after HW3 came out to be offered the upgrade, though. By the time I was eligible, we were in the peak of the early COVID lockdowns and I wasn’t traveling to the not-so-close service center for the upgrade.

            Eventually they did it via mobile service and I got it.

  • Buffalox@lemmy.world

    I just heard from Enron Musk that it crashed into the patrol car way more safely than a human would have done.
    Also, according to Enron Musk, Full Self Driving has been working since 2017 and is in such a refined state now that you wouldn’t believe how gracefully it crashed into that patrol car. It was almost like a car ballet, ending in a small, elegant pirouette.

    As Enron Musk recently stated, in a few months we should have Tesla robotaxis in the streets, and you will be able to observe these beautiful events regularly yourself.

    Others say that’s ridiculous and he’s just trying to save Enron, but it’s too late for that.

  • NameTaken@lemmy.world

    Ugh, I know people feel strongly about FSD and Tesla. As someone who uses it (and still pays attention, hands on the wheel, when it’s activated): when FSD is active, as soon as it sees anything resembling emergency lights it will beep and clearly disengage. I’m not sure, but it’s possible this person is just using Tesla as a scapegoat for their own poor driving. In my experience, it forces the driver to take control when emergency lights are recognized, specifically to avoid incidents like this.

    • Joelk111@lemmy.world

      Doesn’t Tesla usually look at the logs for a situation like this, so we’ll know shortly?

    • NotMyOldRedditName@lemmy.world

      Assuming something was on, I’m not even convinced it was FSD and it could have easily been AP.

      The media and police get that wrong more often than right, and the article isn’t even specifically naming either one.

        • NotMyOldRedditName@lemmy.world

          I can’t imagine a scenario where they’d be on FSD or AP pressing the accelerator AND looking at their phone.

          It’s one thing to press it because the car is being hesitant or something, but that would usually mean you pressed it because it wasn’t doing what you wanted, which means you were watching.

          Him admitting he was on his phone (if truthful) would mean he was pressing the accelerator, thus overriding the system, AND not paying attention.

          It’s a stretch too far.

          If he lied about the phone to try and blame AP/FSD then that could make sense.

    • vxx@lemmy.world

      Thanks for the tip, going to flash my blue flashlight at Teslas from now on.

      • NameTaken@lemmy.world

        Yeah sure if that’s what makes you happy… 👍. Nothing like blinding random people in cars in your spare time.

        • vxx@lemmy.world

          No, not the driver, the faulty sensors and programming that should’ve never been approved for the road.

          • NameTaken@lemmy.world

            Wait so how is it faulty and bad programming if it disengages when emergency vehicles are present? You’d prefer it to stay on in emergency situations?

  • Sludgehammer@lemmy.world

    IMHO it’s the flashing lights. I really think they overload the self-driving software somehow and it starts ignoring changes in driving conditions (like, say, an emergency vehicle parked in the road).

    • Geyser@lemmy.world

      I’ll bet you’re right that it’s the lights, but I don’t know about “overload” of anything.

      The problem with camera vision (vs. human vision or LiDAR) is its limited dynamic range. Pointing a bright light at it, as happens with emergency vehicle lights, can cause it to dim the whole image to compensate and then fail to see the vehicles. It’s the same thing as when you take a backlit photo and can’t see the people.
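
      To see that failure mode in the abstract, here’s a minimal numpy sketch (a toy only, not any real camera or Tesla pipeline): one very bright patch drives a global exposure adjustment down, and the rest of the scene falls below a fixed visibility threshold.

      ```python
      # Toy illustration of global auto-exposure washing out a dim scene when a very
      # bright source (e.g. emergency lights) is in frame. Not a real camera pipeline.
      import numpy as np

      scene = np.full((100, 100), 0.30)   # dim road scene (relative luminance)
      scene[40:60, 40:60] = 40.0          # small, extremely bright light source


      def auto_expose(img, target=0.5):
          """Scale the whole frame so its 99th-percentile brightness hits `target`."""
          return img * (target / np.percentile(img, 99))


      visible = auto_expose(scene) > 0.05  # arbitrary "camera can still see it" cutoff
      print(f"road pixels still visible: {visible[scene == 0.30].mean():.0%}")
      # With the bright patch in frame the road scales down to ~0.004 and prints 0%;
      # remove the bright patch and the same exposure step leaves the road clearly visible.
      ```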