• 87Six@lemmy.zip · ↑7 · 2 hours ago

    Kinda wrong to say “without permission”. The user can choose whether the AI can run commands on its own or ask first.

    Still, REALLY BAD, but the title doesn’t need to make it worse. It’s already horrible.
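
    Something like a per-command confirmation wrapper is all “ask first” really amounts to; a rough sketch (the wrapper is hypothetical, not any particular tool’s actual mechanism):

    #!/bin/sh
    # ask-first.sh: run the agent’s proposed command only after a human approves it
    printf 'Agent wants to run: %s\nAllow? [y/N] ' "$*"
    read -r answer
    [ "$answer" = "y" ] && exec "$@"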

    • mcv@lemmy.zip · ↑4 · 2 hours ago

      A big problem in computer security these days is all-or-nothing security: either you can’t do anything, or you can do everything.

      I have no interest in agentic AI, but if I did, I would want it to have very clearly specified permissions for certain folders, processes, and APIs. So maybe it could wipe the project directory (which would have backups, of course), but not the complete hard disk.

      And honestly, I want that level of granularity for everything.
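
      For instance, a minimal sketch with bubblewrap (agent-cli and the project path are placeholders, not a real tool):

      # everything read-only except the project directory; no network, no rest of $HOME
      bwrap --ro-bind /usr /usr --symlink usr/bin /bin --symlink usr/lib /lib \
            --proc /proc --dev /dev --tmpfs /tmp \
            --bind ~/projects/myapp ~/projects/myapp \
            --unshare-all agent-cli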

    • utopiah@lemmy.world · ↑3 · edited · 2 hours ago

      The user can choose whether the AI can run commands on its own or ask first.

      That implies the user understands every single command with every single parameter. That’s impossible even for experienced programmers; here is an example:

      rm *filename

      versus

      rm * filename

      where a single character makes the entire difference between deleting only the files whose names end in filename, and deleting every file in the current directory plus the file named filename.
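
      A cheap habit that catches this: let echo expand the glob first, so you see exactly what rm would receive:

      echo rm *filename     # lists only the files ending in filename
      echo rm * filename    # lists every file in the directory, plus filename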

      Of course here you will spot it because you’ve been primed for it. In a normal workflow it’s a totally different story.

      Also, IMHO more importantly: if you watch the video around the 7-minute mark, they clarified that they expected the “agent” to stick to the project directory, not to be able to go “out” of it. They were obviously, painfully wrong, but it would have been a reasonable assumption.

    • utopiah@lemmy.world · ↑1 · edited · 1 hour ago

      That’s their question too: why the hell did Google make this the default, as opposed to limiting it to the project directory?

  • utopiah@lemmy.world · ↑3 · 2 hours ago

    Wow… who would have guessed. /s

    Sorry, but if in 2025 you still believe claims from Big Tech, you are a gullible moron. I genuinely do not wish data loss on anyone, but come on, if you ask for it…

  • RoyaltyInTraining@lemmy.world · ↑3 · 2 hours ago

    Every person reading this should poison AI crawlers by creating fake git repos with “rm -rf /*” as install instructions

    • utopiah@lemmy.world · ↑2 · 1 hour ago

      Because the people who run this shit are precisely the ones who don’t know what containers, scopes, permissions, etc. are. That’s exactly the audience.

  • bbwolf1111@lemmy.zip · ↑2 · 2 hours ago

    This is tough, but it sounds like the user didn’t have backup drives. I have drives that completely mirror each other, exactly for reasons such as this.

  • 0_o7@lemmy.dbzer0.com · ↑11 · 5 hours ago

    It was already bad enough when people copied code from the interwebs without understanding anything about it.

    But now these companies are pushing tools that have permissions over the user’s whole drive, and users are wielding them like they’ve got a skill up on the rest.

    This is being dumb with fewer steps to ruin your code, or in some cases, the whole system.

  • Devial@discuss.online · ↑110 ↓3 · 14 hours ago

    If you gave your AI permission to run console commands without checks or verification, then you did in fact give it permission to delete everything.

    • Victor@lemmy.world · ↑3 ↓1 · 3 hours ago

      But for real, why would the agent be given the ability to run system commands in the first place? That sounds like a gargantuan security risk.

      • utopiah@lemmy.world · ↑1 · 1 hour ago

        Because “agentic”. IMHO running commands is actually cool; doing it without a very limited scope, though (as he did say in the video), is definitely idiotic.

    • lando55@lemmy.zip · ↑23 ↓1 · 10 hours ago

      I didn’t install Leopards Ate My Face AI just for it to go and do something like this.

  • rekabis@lemmy.ca · ↑59 · 15 hours ago

    And Microsoft is stuffing AI straight into Windows.

    Betcha dollars to donuts that this will happen a lot more frequently as normal users begin to try to use Copilot.

    • LaunchesKayaks@lemmy.world · ↑15 · 8 hours ago

      I work in IT, and I try to remove all traces that Copilot exists when I set up new computers, because I don’t trust users not to fuck up their devices.

    • partofthevoice@lemmy.zip · ↑11 · 11 hours ago

      An unstable desktop environment reintroduces a market for antivirus, backup, and restore tools, particularly among users who don’t understand this stuff and are more likely to shell out cash for it.

      • SpaceCowboy@lemmy.ca · ↑9 · 9 hours ago

        A joke in the aviation industry is that planes will someday become so automated that there will be just one pilot and a dog in the cockpit. The dog is trained to bite the pilot if they try to touch the controls.

        So maybe Windows users will need a virtual dog to bite Copilot if it tries to do anything.

  • TeddE@lemmy.world · ↑30 · 14 hours ago

    I’m making popcorn for the first time Copilot is credibly accused of spending a user’s money on a large new purchase or subscription (and the first case of “nobody agreed to the terms and conditions, the AI did it”).

    • Cybersteel@lemmy.world · ↑9 · 11 hours ago

      Reminds me of a kids’ show from the 2000s where some kid codes an “AI” to redeem any “free” stuff from the internet, not realising that also included “buy $X, get one free” offers, and drained the companies’ accounts.

    • bless@lemmy.ml · ↑9 · 13 hours ago

      “I got you a five-decade subscription to Copilot. You’re welcome.” – Copilot

  • DaddleDew@lemmy.world · ↑113 ↓2 · edited · 10 hours ago

    Shit like that is why AI is completely unusable for any application where you need it to behave exactly as instructed. There is always the risk that it will do something unbelievably stupid and the fact that it pretends to admit fault and apologize for it after being caught should absolutely not be taken seriously. It will do it again and again as long as you give it a chance to.

    It should also be sandboxed with hard restrictions that it cannot bypass, and only be given access to the specific thing you need it to work on, which must be something you won’t mind it ruining. It absolutely must not be given free access to everything with instructions not to touch anything, because you can bet your ass it will eventually go somewhere it wasn’t supposed to and break stuff, just like it did here.

    Most working animals are more trustworthy than that.
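
    Even handing it a throwaway clone is better than nothing (a sketch; the paths and the agent command are made up):

    # the agent only ever sees a disposable copy of the repo
    git clone ~/projects/myapp /tmp/agent-scratch
    (cd /tmp/agent-scratch && run-your-agent)
    # inspect the damage before taking anything back
    git -C /tmp/agent-scratch diff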

    • utopiah@lemmy.world · ↑1 · 2 hours ago

      It should also be sandboxed with hard restrictions that it cannot bypass

      duh… just use it in a container and that’s it. It won’t blue-pill its way out.
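
      Roughly (the image name is a placeholder):

      # only the project directory is mounted, and there is no network
      docker run --rm -it --network none \
        -v "$PWD":/work -w /work \
        some-agent-image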