Well, every website I find is either crashing or not working on mobile.

  • Angelevo@feddit.nl · +8/-1 · 10 hours ago

    Others have already mentioned the tools.

    This is a losing battle: the systems will continue to improve, and obfuscation will always lag behind. My advice would be to learn to embrace it. Zen meditation can help.

    Understandably, this response may be very annoying, but given the current trajectory of development it is hard to imagine you can win. Not sharing is the only sure defeat, which also defeats the purpose of art. Quite a conundrum.

      • canofcam@lemmy.world · +4 · 5 hours ago

        I feel your response warrants quite a deep and philosophical response in return.

        “No reason to share my drawings”

        That depends why you were sharing them in the first place. It does suck that your content will be stolen and potentially recreated in the future - I have to ask, how do you foresee that affecting you?

        Let’s go back to 2010, you share your drawings, some people see and enjoy them, and the world keeps spinning. Is that what you want to share them for, for people to enjoy them?

        If that’s the case, then my question is: Why does that change if AI is scraping your drawings?

        People will still see and enjoy your drawings, regardless of whether AI consumed them or not.

        “it will only be good to the AI corporations”

        Why will it only be good to the AI corporations? That implies that it wouldn’t be good to anyone anyway, in which case, there is no point posting them either way - but I don’t think that’s true. I think it’s good for you to share your drawings and get feedback, and it’s good for other people to see them. The addition of AI is a negative one, but it doesn’t remove the good, it just adds some bad.

  • Lumidaub@feddit.org · +25 · 16 hours ago

    If you’re mobile-only, that’s probably going to be a problem.

    The only tools I know are Glaze and Nightshade. Think of them as defense and offense, respectively. They both introduce subtle changes to your image that make it harder for AI to use it. Well, I say subtle: the stronger the protection, the more noticeable the effect is to the human eye.

    Glaze is defense, meaning AI can’t use the image to imitate your style. I don’t personally use it because I’m simply not well-known enough for anyone to want to specifically imitate my nonsense.

    Nightshade is offense, it poisons the AI’s data with useless information. Basically it makes the AI think that an image of, say, a cat is a chimney. That’s what I use on anything I upload.
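    To make the cat-as-chimney idea concrete, here is a toy sketch of label poisoning. This is an illustration only, not Nightshade's actual algorithm (which perturbs pixels rather than labels, and targets deep feature extractors); the 2-D "features" and cluster positions are invented for the example. A model that learns a label's concept as the average of images carrying that label will learn a cat-like concept for "chimney" once enough cat-like data is associated with that label.

    ```python
    # Toy sketch of data poisoning (illustration only; NOT Nightshade's method).
    # Hypothetical 2-D "image features": real chimneys cluster near (9, 9),
    # cats near (1, 1). Enough cat-like samples tagged "chimney" drag the
    # model's learned "chimney" concept into cat territory.

    def centroid(vectors):
        """Element-wise mean of equal-length feature vectors."""
        n = len(vectors)
        return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

    def dist2(a, b):
        """Squared Euclidean distance between two feature vectors."""
        return sum((x - y) ** 2 for x, y in zip(a, b))

    chimneys = [[9.0, 9.1], [8.8, 9.2], [9.2, 8.9]]   # clean "chimney" data
    cat_like_poison = [[1.0, 1.0]] * 30               # cat features tagged "chimney"

    clean_concept = centroid(chimneys)                    # near (9, 9)
    poisoned_concept = centroid(chimneys + cat_like_poison)  # pulled toward (1, 1)

    cat_anchor, chimney_anchor = [1.0, 1.0], [9.0, 9.0]
    print(dist2(clean_concept, chimney_anchor) < dist2(clean_concept, cat_anchor))       # True
    print(dist2(poisoned_concept, cat_anchor) < dist2(poisoned_concept, chimney_anchor)) # True
    ```

    The point of the sketch is only that poisoning works by volume and association, which is why Nightshade's developers recommend shading widely shared images.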

    You can use both, though I don’t remember if there’s a recommended order in which to use them. The effect will then also be more noticeable. If you decide to use one or both, you should play around with the settings to see what’s tolerable to you and what makes sense for an individual image.

    Last time I looked, they were working on a mobile solution but I haven’t heard anything on that front, unfortunately.

    • BakedCatboy@lemmy.ml · +12 · 13 hours ago

      I saw this note on the nightshade website:

      For now, if you want to shade your own art, you should Nightshade it first, then Glaze it for protection.

      I think it would be neat if people could donate computer time to artists to run Nightshade, but it doesn’t look like there’s a Docker image with a web UI I could host.

      • Lumidaub@feddit.org · +5 · 13 hours ago

        Oh nice, I seem to have missed that.

        Donating computer time would be a fantastic idea.

  • TachyonTele@piefed.social · +17/-1 · 16 hours ago

    Post them privately, or not at all.
    Especially if you want to sell anything. Then only post your art in that store.

  • turdas@suppo.fi · +9 · edited · 16 hours ago

    Not an expert on this topic but I’ve read about it a fair bit and tinkered around with image generators:

    You don’t post them, basically. Unfortunately nothing else will really work in the long term.

    There are various tools – Glaze is the first one I can think of – that try to subtly modify the pixels in the image in a way that is imperceptible to humans but causes the computer vision part of image generator AIs (the part that, during the training process, looks at an image and produces a text description of what is in it) to freak out and become unable to understand what is in the image. This is known as an adversarial attack in the literature.
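    A minimal sketch of the adversarial-attack idea, under heavy simplifying assumptions: Glaze's real target is a deep feature extractor and its method is far more sophisticated, but against a toy linear scorer, nudging each pixel by at most `eps` against the sign of the weights (an FGSM-style step) flips the decision while bounding how much any single pixel changes. The weights and "image" below are invented for illustration.

    ```python
    # Toy adversarial perturbation against a linear scorer (illustration only;
    # not Glaze's actual method). score(x) = w . x; positive means "cat".

    def score(w, x):
        """Linear 'classifier' score: positive = class 'cat'."""
        return sum(wi * xi for wi, xi in zip(w, x))

    def perturb(w, x, eps):
        """Shift each pixel by at most eps against the score gradient
        (for a linear model, the gradient is just w)."""
        sign = lambda v: (v > 0) - (v < 0)
        return [xi - eps * sign(wi) for wi, xi in zip(w, x)]

    w = [0.4, -0.2, 0.1, 0.3]   # hypothetical learned weights
    x = [0.5, 0.5, 0.5, 0.5]    # an "image" the model scores as cat

    adv = perturb(w, x, eps=0.6)
    print(score(w, x) > 0)      # True: original classified as cat
    print(score(w, adv) > 0)    # False: perturbed image no longer "cat"
    print(max(abs(a - b) for a, b in zip(adv, x)) <= 0.6)  # True: change bounded
    ```

    In a real image the per-pixel budget would be tiny relative to the pixel range, which is exactly the tension the caveats below describe: a smaller budget is less visible but also less effective.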

    The intention of these tools is to make it harder to use the images for training AI models, but there are several caveats:

    • Though they try to be visually undetectable to humans, they can still create obviously visible artifacts, especially on higher strength levels. This is especially noticeable on hand-drawn illustrations, less so on photographs.
    • Lower strength levels with fewer artifacts are less effective.
    • They can only target existing models, and even then won’t be equally effective against all of them.
    • There are ways of mitigating or removing the effect, and it will likely not work on future AI models (preventing adversarial attacks is a major research interest in the field).

    So the main thing you gain from using these is that it becomes harder for people to use your art for style transfer/fine-tuning purposes to copy your specific art style right now. The protection has an inherent time limit because it relies on a flaw in the AI models, which will be fixed in the future. Other abusable flaws will almost certainly remain and be discovered after the ones currently used are fixed, but the art you release now obviously cannot be protected by techniques that do not yet exist. It will be a cat-and-mouse game, and one where the protection systems play the role of the cat.

    Anyway, if you want to try it, you can find the aforementioned Glaze at https://glaze.cs.uchicago.edu/. You may want to read one of their recent updates, which discusses at greater length the specific issue I bring up here, i.e. the AI models overcoming the adversarial attack and rendering the protection ineffective, and how they updated the protection to mitigate this: https://glaze.cs.uchicago.edu/update21.html

  • GrantUsEyes@lemmy.zip · +4 · edited · 17 hours ago

    I know about this tool, but haven’t used it myself:

    https://glaze.cs.uchicago.edu/

    Glaze is a system designed to protect human artists by disrupting style mimicry. At a high level, Glaze works by understanding the AI models that are training on human art, and using machine learning algorithms, computing a set of minimal changes to artworks, such that it appears unchanged to human eyes, but appears to AI models like a dramatically different art style.

  • Denjin@feddit.uk · +2/-4 · edited · 10 hours ago

    I have an idea: if every artist/photographer etc. uploaded their work with a watermark across it that said “f*ck you”, then the models trained on the stolen copies would all start reproducing “f*ck you” for their users as well.

      • Denjin@feddit.uk · +1 · 4 hours ago

        Let’s say the model is trying to replicate a watercolour: if its entire training set of images tagged “watercolour” has a giant “f you” emblazoned across it, the model just thinks that’s what a watercolour is. These systems have no understanding of what the image is or means. People call it artificial intelligence, but it’s the antithesis of intelligent.