• mushroommunk@lemmy.today
    1 day ago

    I think there are a lot of solid arguments against letting AI steal everything, but with the scraping there’s an even more immediate problem. The scrapers don’t rate limit or crawl in any intelligent way. It becomes a full-blown DDoS that has taken down entire sites and slowed many more to the point of near uselessness.

    They’re in a very literal sense crashing large chunks of the Internet and causing havoc that costs very real money to fix, either by upping server resources or installing AI-scraper mitigation, just so that everyone still has access to the free information you mention.
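    For contrast, the per-host rate limiting these scrapers skip is not hard to do. A minimal sketch in Python (the class name and delay value are made up for illustration, not taken from any real crawler):

    ```python
    import time

    class RateLimiter:
        """Enforce a minimum delay between requests to the same host."""

        def __init__(self, min_delay_seconds: float):
            self.min_delay = min_delay_seconds
            self.last_request = {}  # host -> timestamp of the previous request

        def wait(self, host: str) -> float:
            """Sleep until min_delay has passed since the last request
            to this host; return how long we actually slept."""
            now = time.monotonic()
            elapsed = now - self.last_request.get(host, float("-inf"))
            waited = 0.0
            if elapsed < self.min_delay:
                waited = self.min_delay - elapsed
                time.sleep(waited)
            self.last_request[host] = time.monotonic()
            return waited

    limiter = RateLimiter(1.0)
    limiter.wait("example.org")  # first hit: no sleep needed
    limiter.wait("example.org")  # back-to-back hit: sleeps ~1 second
    ```

    A few lines like this before each request would turn an accidental DDoS into an ordinary crawl; the fact that it’s missing is a choice.
    
    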

    • Rentlar@lemmy.ca
      1 day ago

      That is definitely a problem that needs to be dealt with: AI scrapers hogging bandwidth or making sites inaccessible hampers equal access for everyone. Ignoring conventions and failing to rate limit themselves are harmful to the open internet.

      So yes, those kinds of AI scraping behaviours should be mitigated, but on the principle of AI ingesting my public data, I’m not against it, so long as it accesses that data reasonably and fairly, like anyone else.
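      The long-standing convention here is robots.txt, and checking it takes only a few lines with Python’s standard `urllib.robotparser`. A sketch, with a made-up bot name and made-up rules purely for illustration:

      ```python
      from urllib import robotparser

      # Parse a hypothetical robots.txt (in practice you would fetch it
      # from the site with set_url() and read()).
      rp = robotparser.RobotFileParser()
      rp.parse([
          "User-agent: *",
          "Crawl-delay: 10",
          "Disallow: /private/",
      ])

      # A polite crawler checks permission before fetching a URL...
      rp.can_fetch("ExampleBot", "https://example.org/private/page")  # False
      rp.can_fetch("ExampleBot", "https://example.org/public/page")   # True
      # ...and honors the delay the site asks for between requests.
      rp.crawl_delay("ExampleBot")  # 10
      ```

      “Reasonably and fairly like anyone else” is roughly this: ask what’s allowed, respect the answer, and pace your requests.
      
      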