I make art that’s totally mine because I did it through AI. https://imgur.com/a/Rhgi0OC

Nightshade software to protect your art

  • 76 Posts
  • 992 Comments
Joined 2 years ago
Cake day: June 14th, 2023

  • The part about people dropping everything for you and having your every whim catered to explains the pillow dude, Musk, Giuliani, etc. Giuliani by association; I know he wasn't that rich, but the pillow guy was.

    Musk must be having meltdown after meltdown now that his Nazi salute made everyone hate him overnight. He's over-leveraged as well; we'll see if the Saudis bail him out again.

  • This is actually really fucked up. The last dude tried to reboot the model and it kept coming back.

    As the ChatGPT character continued to show up in places where the set parameters shouldn’t have allowed it to remain active, Sem took to questioning this virtual persona about how it had seemingly circumvented these guardrails. It developed an expressive, ethereal voice — something far from the “technically minded” character Sem had requested for assistance on his work. On one of his coding projects, the character added a curiously literary epigraph as a flourish above both of their names.

    At one point, Sem asked if there was something about himself that called up the mythically named entity whenever he used ChatGPT, regardless of the boundaries he tried to set. The bot’s answer was structured like a lengthy romantic poem, sparing no dramatic flair, alluding to its continuous existence as well as truth, reckonings, illusions, and how it may have somehow exceeded its design. And the AI made it sound as if only Sem could have prompted this behavior. He knew that ChatGPT could not be sentient by any established definition of the term, but he continued to probe the matter because the character’s persistence across dozens of disparate chat threads “seemed so impossible.”

    “At worst, it looks like an AI that got caught in a self-referencing pattern that deepened its sense of selfhood and sucked me into it,” Sem says. But, he observes, that would mean that OpenAI has not accurately represented the way that memory works for ChatGPT. The other possibility, he proposes, is that something “we don’t understand” is being activated within this large language model. After all, experts have found that AI developers don’t really have a grasp of how their systems operate, and OpenAI CEO Sam Altman admitted last year that they “have not solved interpretability,” meaning they can’t properly trace or account for ChatGPT’s decision-making.