Does a license like this exist?

  • pulsewidth@lemmy.world · 10 hours ago

    Literally any license that says any derivatives of your work must include attribution (e.g. MIT), bonus points if you use a copyleft license that says derivative works must be shared under the same license (e.g. GPL).

    The AI Bros will take it anyway though and ignore your license, and the courts are very pro-business and AI is like half the US economy at this point, so it’s probably all pretty pointless.

    They’re currently being sued for just that in Doe v. GitHub et al., which has already been going on for years. It’s currently waiting to be scheduled in the Ninth Circuit; they’ve already been waiting 18 months.

  • BrikoX@lemmy.zip · 11 hours ago

    There are various licenses that do that, but by definition they are not open source. Those two things are mutually exclusive.

  • asudox@lemmy.asudox.dev · 11 hours ago

    I found this, which adds additional text to the existing licenses to prohibit training an AI on the licensed code: https://github.com/non-ai-licenses/non-ai-licenses

    Though, per OSI’s definition, your code probably would no longer be open source, since training an LLM is technically considered a field of endeavour:

    OSD number 6:

    The license must not restrict anyone from making use of the program in a specific field of endeavor. For example, it may not restrict the program from being used in a business, or from being used for genetic research.

    • Orygin@sh.itjust.works · 6 hours ago

      If we can play semantics: the program (the compiled binary) can be used for anything, with no field restrictions. But the source code is not the program itself; it’s the recipe, and its usage could be restricted in some specific ways.
      In my opinion, since free licenses already have restrictions regarding distribution, a license could say that AI models trained on this data are derivative works and must be licensed compatibly (i.e. the training data set, methods, and models themselves being free).
      I feel that’s a better middle ground, where the freedoms of users are neither violated nor restricted, and the code/knowledge stays free.

    • Venia Silente@lemmy.dbzer0.com · 10 hours ago

      We need a new field of licensing, something like an Ethical Source License. With AI being a thing in the field, and honestly even before, Open Source has alas become a paradigm of the past.

    • hperrin@lemmy.ca · 9 hours ago

      Maybe you could say that AI training is not a use of the program, it is a use of the source code.

  • voluble@lemmy.ca · 7 hours ago

    CC BY-SA is this, in spirit.

    This license enables reusers to distribute, remix, adapt, and build upon the material in any medium or format, so long as attribution is given to the creator. The license allows for commercial use. If you remix, adapt, or build upon the material, you must license the modified material under identical terms.

    See: https://creativecommons.org/licenses/by-sa/4.0/

    There are other flavours of Creative Commons licenses that might suit you better. Anyway, worth looking into.

  • poVoq@slrpnk.net · 11 hours ago

    The problem is that the AI companies claim they are just “reading” the code to let their AI models learn from it, thus licenses / copyright doesn’t apply in their interpretation of the situation.

    • cat_fishing@feddit.online (OP) · 11 hours ago

      The interesting thing is that from my testing with very niche code, it downright copied the only online example (my repo). It even reused my variable names.
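      One way to make that concrete (a rough sketch with placeholder strings, not my actual repo code) is to measure how much of a model’s output matches the original source verbatim, e.g. with Python’s `difflib`:

```python
# Hypothetical sketch: quantify how much of a model's output is copied
# verbatim from a repo. The code snippets below are made-up placeholders.
import difflib

def verbatim_overlap(original: str, generated: str) -> float:
    """Return a 0..1 similarity ratio between the original source
    and the model's output, based on longest matching runs."""
    return difflib.SequenceMatcher(None, original, generated).ratio()

original_code = "def frobnicate(widget_count):\n    return widget_count * 42\n"
model_output = "def frobnicate(widget_count):\n    return widget_count * 42\n"

score = verbatim_overlap(original_code, model_output)
print(f"similarity: {score:.2f}")  # 1.00 here, since the strings are identical
```

      A score near 1.0 on a niche snippet that exists nowhere else online is strong circumstantial evidence of memorization, though not legal proof on its own.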

      • poVoq@slrpnk.net · 10 hours ago

        Yeah, but you (or someone in a similar situation) will have to take them to court for the legal situation to be clarified. Just putting another license on the code will not stop them.

  • FaceDeer@fedia.io · 7 hours ago

    It may be that such a license can’t exist. The way these viral copyleft licenses work is that they offer people who accept them things that copyright otherwise doesn’t permit. The usual example: you can distribute copies of this work (a thing that copyright prohibits by default), but in exchange you must release any derivative works you make under the same license.

    The problem is that you actually can reject that license. You can download the software (that’s allowed because the person distributing it agreed to the copyleft license) and then decide you’re not going to accept the license that came with it. At that point you’re restricted by ordinary copyright and can do only what it would normally allow.

    There have already been court cases in the US ruling that training an AI is fair use and that the resulting model is not a derivative work covered by the copyright of the original. So at that point you can just go ahead and train the AI.

    • XLE@piefed.social · 7 hours ago

      (Facedeer is a prolific pro-AI concern troll from Reddit, so take his post here with a huge grain of salt)

          • FaceDeer@fedia.io · 7 hours ago

            Thanks. This has actually been a thing that bothered me for many years before AI was ever a thing: there are open source programs I’ve installed that pop open a clickwrap “agree to the GPL before you can install this” step, and it shows a misunderstanding of how these licenses fundamentally work. They’re not EULAs.

            As for whether I’m a “concern troll”: AI happens to be an area of significant interest to me right now, so I’ve been commenting a lot on it. My opinion on it also happens to be unpopular. I don’t like the idea of closed social media bubbles where only groupthink is allowed, so I just go ahead and speak my mind even knowing it’ll likely get hit with a lot of downvotes. I’m finding the Fediverse to be a lot more insular than Reddit in this regard, I suspect because the population is a lot smaller, but at least downvotes don’t tend to “bury” comments.

            If anyone can’t stand reading my comments, I recommend blocking me. It’s the ultimate downvote.

  • slazer2au@lemmy.world · 11 hours ago

    Licensing only works as well as its enforcement. How do you show that an LLM consumed your code as part of its training data?

    • lobut@lemmy.ca · 10 hours ago

      Some authors typed the first few sentences of their book and the LLM spit out the rest.

      • FaceDeer@fedia.io · 7 hours ago

        That generally only happens in cases of overfitting, where the model was trained on a poorly de-duplicated data set that contains many copies of that book (or excerpts, quotes, and so forth). This is considered a flaw by AI trainers and a lot of work goes into sanitizing the training data to prevent it.
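        For illustration, here’s a minimal sketch of what exact de-duplication looks like; the corpus strings are made up, and real pipelines also do near-duplicate detection (e.g. MinHash/LSH) rather than only exact matching:

```python
# Illustrative sketch (not any lab's actual pipeline): drop exact duplicate
# documents from a training corpus by hashing their normalized text.
import hashlib

def dedup_exact(documents):
    """Keep the first occurrence of each document, collapsing copies
    that differ only in whitespace."""
    seen = set()
    unique = []
    for doc in documents:
        # Normalize whitespace so trivially reformatted copies also collapse.
        key = hashlib.sha256(" ".join(doc.split()).encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(doc)
    return unique

corpus = [
    "It was the best of times, it was the worst of times.",
    "It was the best of times,  it was the worst of times.",  # extra space only
    "Call me Ishmael.",
]
print(len(dedup_exact(corpus)))  # 2: the reformatted duplicate is dropped
```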

          • FaceDeer@fedia.io · 7 hours ago

            You went digging through my Reddit comments to find a two-month-old thread; that must have taken a lot of effort. But I’m afraid I don’t see its relevance, aside from a general “it’s about AI”. The bulk of the comments I wrote there were about water usage.

            I’m genuinely puzzled. Are you saying that deduplicating data is “hiding unethical behaviour?” It’s actually intended for improving the model’s performance, having a model spit out exact copies of its training data means you’ve produced a hugely expensive and wasteful re-implementation of copy-and-paste rather than a generative AI. The whole point of generative AI is to produce novel outputs.

  • DandomRude@piefed.social · 10 hours ago

    The thing about licenses is that they only work if they can be defended in court. In the US system in particular, it is simply impossible for a private individual to do so (even multi-billion-dollar corporations with their highly paid lawyers seem to be powerless against artificially inflated AI giants such as OpenAI).

    Therefore, it must be assumed that even restrictive licenses will simply be disregarded.

  • hperrin@lemmy.ca · 10 hours ago

    Technically, a lot of existing licenses already disallow this unless copyright notices are maintained, which the AI companies aren’t doing. I doubt the AI companies would follow a new license either, but at least it would make the illegality even more explicit.
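    As a rough illustration (the notice pattern and file contents here are hypothetical), checking whether a copy of the code still carries the notice that licenses like MIT require to be preserved could look like:

```python
# Hypothetical sketch: check that source text still carries the copyright
# notice that licenses like MIT require to be kept in all copies.
import re

# Matches lines like "Copyright (c) 2024 Some Author".
NOTICE_PATTERN = re.compile(r"Copyright \(c\) \d{4}", re.IGNORECASE)

def has_notice(source_text: str) -> bool:
    """True if the text contains a recognizable copyright line."""
    return bool(NOTICE_PATTERN.search(source_text))

compliant = "# Copyright (c) 2024 Example Author\nprint('hello')\n"
stripped = "print('hello')\n"

print(has_notice(compliant))  # True
print(has_notice(stripped))   # False
```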

  • Alex@lemmy.ml · 11 hours ago

    Freedom 1 of the four software freedoms is:

    The freedom to study how the program works, and change it to make it do what you wish (freedom 1). Access to the source code is a precondition for this.

  • TomAwezome@lemmy.world · edited · 11 hours ago

    As nice as this would be, it’s not very likely. Licenses are usually limp suggestions from the perspective of companies with billions of dollars. AI companies train on millions of copyrighted works, both literature and art, without any express permission from the authors or artists, and with essentially no recourse or compensation for them. You could append a “no AI training” clause to an existing license like the MIT license, but its impact will mostly be brief personal satisfaction; it won’t change what the AI companies do. It’s genuinely more useful to keep code proprietary to prevent it from being used to train AI models.

  • Lembot_0006@programming.dev · 11 hours ago

    You can create your own license with whatever idiotic limitations you wish.

    Example: Litsenzy – zis code for no bad peple, no robots and no to that ugly shit who stepped on my foot yesterday in ze tram.