• Zwuzelmaus@feddit.org
    1 day ago

    And these foreign crowd workers know the local traffic rules? Maybe they even have regular driver’s licenses?

    • Chozo@fedia.io
      1 day ago

      This used to be my job. They’re not controlling the cars. They’re basically completing real-time CAPTCHAs, telling the car whether the cameras see a stop sign, a bicycle, temporary barriers, etc. If the car can’t identify an object that could possibly cross its path, it pulls over and stops until an operator can do a sanity-check on whatever the car’s confused by. They only need to be able to identify objects on the road, not know the rules of the road.
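
      Roughly, the decision loop is something like the sketch below - a toy illustration, not Waymo’s actual code, with every name and the threshold invented:

      ```python
      from dataclasses import dataclass

      @dataclass
      class Detection:
          label: str           # model's best guess, e.g. "stop sign"
          confidence: float    # 0.0 - 1.0
          camera_frame: bytes  # image that would be shown to the operator

      CONFIDENCE_THRESHOLD = 0.9  # invented cutoff, purely illustrative

      def ask_remote_operator(frame: bytes) -> str:
          """Stand-in for the real-time 'CAPTCHA' sent to a remote worker."""
          return input("Operator, what is this object? ")

      def handle_detection(d: Detection) -> str:
          if d.confidence >= CONFIDENCE_THRESHOLD:
              return d.label  # car is confident: keep driving
          # Low confidence: pull over and wait for a human sanity-check.
          print("Pulling over, requesting operator review...")
          return ask_remote_operator(d.camera_frame)
      ```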

    • Perspectivist@feddit.uk
      1 day ago

      I think the interventions here are more like: “that’s a trash can someone pushed onto the road - let me help you around it” rather than: “let me drive you all the way to your destination.”

      It’s usually not the genuinely hard stuff that stumps AI drivers - it’s the really stupid, obvious things they’ve simply never encountered in their training data before.

      • MoffKalast@lemmy.world
        1 day ago

        Saw a blog post recently about Waymo’s sim setup for generating synthetic data, and they really do seem to be generating pretty much everything in existence. Either the level of generalization of the model they’re using is shockingly low, or they abort immediately at the earliest sign of high perplexity.
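
        For anyone unfamiliar: perplexity is just a measure of how surprised the model is by its input. A minimal sketch of that kind of bail-out check (the threshold and names are completely made up):

        ```python
        import math

        def perplexity(token_logprobs: list[float]) -> float:
            """exp of the mean negative log-likelihood of the model's predictions."""
            return math.exp(-sum(token_logprobs) / len(token_logprobs))

        PERPLEXITY_LIMIT = 20.0  # invented threshold, purely illustrative

        def should_hand_off(token_logprobs: list[float]) -> bool:
            # Bail out to a human as soon as the model is too surprised.
            return perplexity(token_logprobs) > PERPLEXITY_LIMIT
        ```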

        • Kushan@lemmy.world
          22 hours ago

          I’m guessing it’s the latter; they need to keep accidents to a minimum if they’re ever going to get broad legislation legalising them.

          Every single accident is analysed to death by the media and onlookers alike, with a large group of people wanting it to fail.

          This is a prime example: we’ve known about the human intervention for a while now, but people still seem surprised that those people are in another country.

      • Zwuzelmaus@feddit.org
        1 day ago

        it’s the really stupid, obvious things

        Hm. Interesting. But that makes them look even more incapable than I feared.

        • Perspectivist@feddit.uk
          1 day ago

          Broadly speaking, an AI driver getting stumped means it’s stuck in the middle of the road - while a human driver getting stumped means plowing into a semi truck.

          I’d rather be inconvenienced than killed. And from what I’ve seen, even our current AI drivers are already statistically safer than the average human driver - and they’re only going to keep getting better.

          They’ll never be flawless though. Nothing is.

          • MrScottyTay@sh.itjust.works
            1 day ago

            AI drivers have slowly run over and crushed people before too, though, because the car didn’t register the person as an “obstacle” to be avoided, or didn’t see them at all because they were lying on the ground.

            • Perspectivist@feddit.uk
              1 day ago

              And they always will. You need to look at the big picture here, not individual cases. If we replaced every single car on US roads with one driven by AI - proven to be 10 times better a driver than a human - that would still mean 4,000 people getting killed by them each year. That, however, doesn’t mean we should go back to human drivers and 40,000 people killed annually.
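
              The arithmetic behind those numbers:

              ```python
              human_deaths_per_year = 40_000  # rough US road-death figure
              safety_factor = 10              # the hypothetical "10x better" AI driver

              ai_deaths_per_year = human_deaths_per_year / safety_factor
              print(ai_deaths_per_year)  # 4000.0 - still a lot, but 36,000 fewer
              ```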

              • ltxrtquq@lemmy.ml
                1 day ago

                You need to look at the big picture here, not individual cases.

                By that logic…

                We should really be investing in trains and buses, not cars of any type.

                • walden@wetshav.ing
                  1 day ago

                  I think your logic is flawed. The discussion is about a specific form of transportation. By your own logic, you should be suggesting that people fly everywhere.

                  • Clent@lemmy.dbzer0.com
                    23 hours ago

                    Yes. AI human transportation drones make far more sense. It’s much easier to avoid things because airspace can be controlled. We just need to figure out how to make them efficient enough that a ride lasts more than 5 minutes.

                  • ltxrtquq@lemmy.ml
                    1 day ago

                    For long distance maybe, but immediately saying we should all fly everywhere because it has the fewest deaths per passenger mile would really not be looking at the big picture.

              • zbyte64@awful.systems
                17 hours ago

                The big picture is that AI not being able to operate under unusual conditions means the “10 times better” figure (if it were even true) comes with a big fucking caveat: we can’t say the stat would hold if we replaced all drivers.

                • RobotToaster@mander.xyz
                  1 day ago

                  Tesla made the idiotic decision to rely entirely on cameras; Waymo uses lidar and other sensors to augment vision.

                • Pennomi@lemmy.world
                  1 day ago

                  That’s Tesla, not Waymo. Tesla’s hardware is shit and does not even include lidar. You can’t judge the entire industry by the worst example.

                  • Perspectivist@feddit.uk
                    1 day ago

                    New HW4 Teslas do in fact include a front-facing radar, but it’s currently only used for collecting data - not for FSD.

                    Still, gotta give them credit for getting by with vision-only quite well. I don’t personally see any practical reason why you absolutely must include LiDAR. We already know driving relatively safely with vision only is possible - all the best drivers in the world do it.

          • Zwuzelmaus@feddit.org
            1 day ago

            current AI drivers are already statistically safer than

            As long as they run Level 3 autonomous cars and cheat with remote operators instead of fielding real Level 5 cars, such statistics remain quite meaningless.

            However, they do tell you something about the people who use them as arguments.

            • errer@lemmy.world
              1 day ago

              As the OP stated, the low velocity cases are not causing deadly accidents. And you can’t drive by wire at high speed (too much latency). So I doubt it’s affecting the stats in any meaningful way.
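
              Back-of-the-envelope on the latency point (the round-trip figure is an assumption):

              ```python
              speed_kmh = 100   # highway speed
              latency_s = 0.5   # assumed round trip: video uplink + reaction + control downlink

              blind_distance_m = speed_kmh / 3.6 * latency_s
              print(f"{blind_distance_m:.1f} m")  # ~13.9 m travelled before the input arrives
              ```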

              Honestly I much prefer they have a human as a backup than not.

              • [deleted]@piefed.world
                24 hours ago

                As the OP stated, the low velocity cases are not causing deadly accidents.

                Make humans drive as slow as these cars and deaths will drop too.

      • Zwuzelmaus@feddit.org
        1 day ago

        No. I am not from there. Feel free to explain what is possible.

        In my country we have a law that requires such remote operators to have a license that is valid here.

        (Sadly, we do not require them to reside here)

        • whereIsTamara@lemmy.org
          18 hours ago

          So if a business has an AI drive a car, but the AI then hands it over to a human who has no driver’s license for that location, they are essentially letting someone operate a vehicle without a license - someone who isn’t even inside the country. If that car crashes into someone, Waymo has to explain why they let someone wildly unqualified and unlicensed operate for them. That’s millions in damages for gross negligence.

      • NotMyOldRedditName@lemmy.world
        15 hours ago

        This is how it generally behaves, but they are capable of taking direct control in more difficult situations. It’s only for very slow maneuvers, though; it’s not like they’d be driving it down the street. They could move it off the road onto the shoulder if needed.

        Edit: I’m trying to find the source but having problems. It was only ever mentioned in one official Waymo document I’ve seen that this is technically possible. My guess is they say their remote helpers can’t / don’t do it because those particular people truly can’t, and it’s some highly restricted type of person who can, who isn’t classified like these other employees - the whole misleading-but-technically-true kind of speak. I’ll keep looking, though, because I was really surprised to see them admit it in an official document.

        Found it

        https://www.cpuc.ca.gov/-/media/cpuc-website/divisions/consumer-protection-and-enforcement-division/documents/tlab/av-programs/tcp0038152a-waymo-al-0003_a1b.pdf

        In very limited circumstances such as to facilitate movement of the AV out of a freeway lane onto an adjacent shoulder, if possible, our Event Response agents are able to remotely move the Waymo AV under strict parameters, including at a very low speed over a very short distance.

        Looks like I was right as well on terminology, it’s not the remote operators that can do it, it’s the “Event Response” team that can.

        As far as I know this is the only official acknowledgement it’s possible. Everywhere else they say it isn’t, and this is a footnote in that document.

        • Zwuzelmaus@feddit.org
          8 hours ago

          at a very low speed over a very short distance.

          LOL, so when they get into a situation in a tunnel that is 10 or 20 km long (OK, you only have 4 km ones in the poor USA, but we have them here), they first drive it at 10 km/h and then give up after 300 m? Because the rules are the rules??

          • NotMyOldRedditName@lemmy.world
            8 hours ago

            From the description it’s really not meant to solve that. In a situation like that they’d have to send someone, but they would be able to get it out of the middle of the lane and off to the side, even if that only gives an extra foot or two of space to pass the vehicle.

            Edit: And that’s assuming their remote helpers couldn’t direct the car to drive itself out using their other tool where the AI drives itself with their suggestions.

      • [deleted]@piefed.world
        24 hours ago

        That is like a person steering to avoid a collision while cruise control and lane assist are on; it isn’t actually fully autonomous.