• MadMadBunny@lemmy.ca · 4 hours ago

    These AI products all use dark patterns to keep people hooked. It’s essentially a predictive-text Candy Crush or slot machine.

    These are dangerous.

    • reabsorbthelight@lemmy.world · 4 hours ago

      Do you know of any reports that AI companies did this intentionally? It would make business sense for them to make their products addictive.

      • MadMadBunny@lemmy.ca · edited · 4 hours ago

        No official report that I can link to, though I recall seeing something at some point. I mostly speak from personal experience. I’m a bit sensitive to these patterns, so I noticed them right away. I barely use AI, mostly because of the “hooking effect”. Some friends of mine are downright addicted; it’s sad to see.

        • ElBarto@piefed.social · 3 hours ago

          I’m far from being an AI defender. And for the longest time I resisted the idea of vibe coding.

          I’ll grant you that, without the right experience, vibe coding feels like gambling.

          But I learned rather quickly that you must first work out a dev plan with the LLM, and until that plan covers every scenario, you don’t move on.

          That yields much better results, and it has the added advantage of leaving you with blueprints for when you need the LLM to make changes later.