• siderealyear@lemmy.world · 1 year ago

    Most of the internet was already BS before ‘working’ LLMs; where do you think the models learned it from? I think what you want is a crap detector, and I’m with you. Share any good ideas and I’ll donate my time to work on them.

  • monobot@lemmy.ml · 1 year ago

    I think at some point we will have to introduce human confirmation on the creator’s side.

    I don’t mind someone using ChatGPT as a tool to write better articles, but most of the internet is senseless BS.

  • Nawor3565@lemmy.blahaj.zone · 1 year ago

    Unfortunately, even OpenAI took down their own AI detection tool because it was too inaccurate. It’s really, REALLY hard to detect AI writing with current technology, so any such addon would probably need to rely on a master list of articles manually flagged by humans.

      • apis@beehaw.org · 1 year ago

        I suspect it would operate more on the basis of a person confirming that the article is of reasonable quality and accuracy.

        So not unlike editors selecting what to publish, what to reject & what to send back for improvements.

        If good articles by AI get accepted & poor articles by people get rejected, there may still be impacts, but at face value it might be sufficient for those of us just looking for stuff to read.

    • lily33@lemm.ee · 1 year ago

      That said, it should actually be possible to make a detector for bullshit writing itself, regardless of whether a human or an LLM produced it.
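      A crude version of that idea can be sketched as a heuristic scorer. This is purely illustrative: the phrase list and the scoring rule are invented for the example, not taken from any real detector.

      ```python
      import re

      # Invented list of filler phrases common in low-effort SEO/LLM copy.
      FILLER_PHRASES = [
          "in today's fast-paced world",
          "it is important to note",
          "unlock the power",
          "game-changer",
          "delve into",
      ]

      def bullshit_score(text: str) -> float:
          """Return filler-phrase hits per 100 words (higher = more suspect)."""
          lowered = text.lower()
          hits = sum(lowered.count(phrase) for phrase in FILLER_PHRASES)
          words = max(len(re.findall(r"\w+", text)), 1)
          return 100.0 * hits / words
      ```

      A real detector would need far more than phrase matching, but the point stands: you can score the writing itself instead of guessing at its author.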

  • Cwilliams@beehaw.org · 1 year ago

    I know there’s GPTZero. I personally don’t trust it at all, but you could still look into it.

  • apis@beehaw.org · 1 year ago

    I’ve got fairly good at spotting these from the first few lines, but it would be nice not to bother clicking on them in the first place, and better still if they didn’t clog up my search results.

    Back when it was just humans churning out rubbish, there was far less of it getting in the way of good information, but it helped enormously that search engines still respected operators.

    Bringing that back would likely help far more than a detector extension.

    • loki@lemmy.ml · 1 year ago

      You can’t follow it, but you can save it with the star icon and come back to it later.

      • Ategon@programming.dev · 1 year ago

        Note the remindme bot uses an allowlist and this community isn’t in it; you’d have to get your community mods to request that it be added in the repository if you want to use it here.

        • Nix@merv.news · 1 year ago

          Why does it use an allowlist? It seems fine to just run it across all of Lemmy, since it only appears when summoned.

          • Ategon@programming.dev · edited · 1 year ago

            Bot guidelines for some of the major instances don’t allow bot posting unless it’s been approved by a mod. It also makes more sense for mods to choose which bots to allow in their community rather than response bots being allowed everywhere by default; that could easily get out of hand if a bunch of them get made.
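            The gating described here amounts to a simple check before the bot replies. A minimal sketch, assuming a hypothetical bot and invented community names (the real remindme bot’s allowlist lives in its repository):

            ```python
            # Hypothetical opt-in allowlist; real entries are requested by community mods.
            ALLOWED_COMMUNITIES = {
                "programming@programming.dev",
                "asklemmy@lemmy.ml",
            }

            def should_respond(community: str, summoned: bool) -> bool:
                """Reply only when explicitly summoned AND the community has opted in."""
                return summoned and community in ALLOWED_COMMUNITIES
            ```

            With a check like this, a mention in a non-allowlisted community is silently ignored, which is what keeps response bots from spreading everywhere by default.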

  • Cwilliams@beehaw.org · 1 year ago

    Another thought: does it really matter whether it’s AI-generated? As long as you can fact-check the content and the quality isn’t horrible, I don’t see why it matters whether a real person wrote it.