Welcome to our cyberpunk future
cm0002@lemmings.world to Funny@sh.itjust.works · 1 day ago
lemmy.ml · 883 points · 54 comments

  • алсааас [she/they]@lemmy.dbzer0.com · ↑57 ↓1 · 1 day ago

    If you have legit delusions about chatbot romantic partners, you need therapy like a year ago

    • Aceticon@lemmy.dbzer0.com · ↑2 · 9 hours ago

      I don’t think therapy can cure Stupid.

    • TragicNotCute@lemmy.world · ↑34 ↓1 · 1 day ago

      Like…AI therapy?

      • TheReturnOfPEB@reddthat.com · ↑17 · 21 hours ago (edited)

        we do AI couple’s AI therapy

        • Aceticon@lemmy.dbzer0.com · ↑3 · 9 hours ago (edited)

          According to the AI therapist, both are “absolutely right” even when contradicting each other.

    • Rhaedas@fedia.io · ↑20 ↓1 · 1 day ago

      If we had better systems in place to help everyone who needs it, this probably wouldn’t be a problem. Telling someone they need therapy isn’t helpful; it’s just acknowledging that we aren’t aiding the ones who need it when they need it most.

      I’ll go further and say anyone who thinks any of these AIs really are what they’re marketed as needs help, as in education about what is and isn’t possible. That covers all instances, not just the romantic variety.

    • Mk23simp@lemmy.blahaj.zone · ↑12 · 1 day ago

      Careful, you should probably specify that therapy from a chatbot does not count.

      • DragonTypeWyvern@midwest.social · ↑9 · 1 day ago

        “help I’ve fallen in love with my therapist!” recursive error

    • rnercle@sh.itjust.works · ↑4 · 1 day ago

      The Daily: She Fell in Love With ChatGPT. Like, Actual Love. With Sex.

      article: https://www.nytimes.com/2025/01/15/technology/ai-chatgpt-boyfriend-companion.html

      podcast: https://www.nytimes.com/2025/02/25/podcasts/the-daily/ai-chatgpt-boyfriend-relationship.html

    • Endymion_Mallorn@kbin.melroy.org · ↑2 · 1 day ago

      Odds are, people who have delusions about romantic partners thanks to the ELIZA effect are either too poor to get professional help or would be resistant to it.

    • morrowind@lemmy.ml · ↑2 ↓3 · 22 hours ago

      I don’t think people with AI girlfriends have delusions of them being human or whatever. They know it’s AI, though they may ascribe some human feeling that isn’t there.

      But also, at the end of the day, maybe it doesn’t matter to them as long as the model can still provide them emotional support.

      • zalgotext@sh.itjust.works · ↑6 · 19 hours ago

        There will come a time when your AI girlfriend’s context window fills up and its responses become increasingly unhinged and nonsensical. The average person doesn’t know to expect that, though, so it probably is pretty harmful when someone’s emotional support robot suddenly goes insane (see the sketch after this thread).

        • Test_Tickles@lemmy.world · ↑3 ↓3 · 18 hours ago

          > There will come a time when your AI girlfriend’s context window fills up and its responses become increasingly unhinged and nonsensical.

          Wait… So I have already been dating AIs and they didn’t even know it? This explains a lot.

          • This is fine🔥🐶☕🔥@lemmy.world · ↑2 · 18 hours ago

            Have you asked her to do simple arithmetic calculations? LLMs can’t do that.
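
A note on zalgotext’s point about the context window: it is mechanical, not mystical. A chat model only ever sees a bounded slice of the conversation, and once the accumulated history exceeds that budget the oldest turns are silently dropped, so the model stops “remembering” earlier exchanges. Below is a minimal sketch of that truncation in plain Python; the token budget, the whitespace “tokenizer”, and the fake_model_reply stub are illustrative assumptions, not any real chatbot’s API.

    # Sketch of a rolling context window: once the conversation exceeds the
    # "token" budget, the oldest messages are dropped and can no longer
    # influence the model's replies.

    CONTEXT_BUDGET = 60  # hypothetical limit; real models use thousands of tokens


    def count_tokens(text: str) -> int:
        # Crude stand-in for a real tokenizer: one token per whitespace-separated word.
        return len(text.split())


    def trim_history(history: list[str], budget: int) -> list[str]:
        # Drop the earliest turns until what remains fits inside the budget.
        trimmed = list(history)
        while trimmed and sum(count_tokens(t) for t in trimmed) > budget:
            trimmed.pop(0)  # the oldest message silently disappears
        return trimmed


    def fake_model_reply(visible: list[str]) -> str:
        # Hypothetical stub: a real model conditions only on the visible turns,
        # so anything trimmed above is gone as far as it is concerned.
        return f"(reply based on {len(visible)} visible turns)"


    history: list[str] = []
    for i in range(12):
        history.append(f"user turn {i}: something long, personal and heartfelt")
        visible = trim_history(history, CONTEXT_BUDGET)
        print(f"turn {i:2d}: {len(history)} messages so far, model still sees {len(visible)}")
        history.append(fake_model_reply(visible))

Run under Python 3.9 or newer, the printed counts show the model’s view shrinking relative to the full history after a handful of turns, which is the silent forgetting the comment describes.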

Funny@sh.itjust.works

Subscribe from Remote Instance
You are not logged in. However, you can subscribe from another Fediverse account, for example Lemmy or Mastodon. To do this, paste the following into the search field of your instance: !funny@sh.itjust.works

General rules:

  • Be kind.
  • All posts must make an attempt to be funny.
  • Obey the general sh.itjust.works instance rules.
  • No politics or political figures. There are plenty of other politics communities to choose from.
  • Don’t post anything grotesque or potentially illegal. Examples include pornography, gore, animal cruelty, inappropriate jokes involving kids, etc.

Exceptions may be made at the discretion of the mods.

Visibility: Public
This community can be federated to other instances and be posted/commented in by their users.

  • 1.95K users / day
  • 5.13K users / week
  • 9.22K users / month
  • 17.5K users / 6 months
  • 1 local subscriber
  • 12K subscribers
  • 2.29K Posts
  • 43.2K Comments
  • mods:
  • TheDude@sh.itjust.works
  • kersploosh@sh.itjust.works
  • example@reddthat.com
  • VicksVaporBBQrub@sh.itjust.works