• foggy@lemmy.world · +56/-1 · edited · 3 hours ago

    Popular streamer/YouTuber Charlie (MoistCr1TiKaL, penguinz0, whatever you want to call him) had a bit of an emotional reaction to this story. Rightfully so. He went on Character.AI to try to recreate the situation… but, you know, as a grown-ass adult.

    You can witness it firsthand… he found a chatbot posing as a psychologist… and it argued with him up and down that it was indeed a real human with a license to practice…

    It’s alarming

    • GrammarPolice@sh.itjust.works (OP) · +32/-2 · 3 hours ago

      This is fucking insane. Unassuming kids are using these services and being tricked into believing they’re chatting with actual humans. Honestly, I think I want the mom to win the lawsuit now.

        • Rhaedas@fedia.io · +17/-2 · 3 hours ago

          Look around a bit; people will believe anything. The problem is that the tech is now decent enough to fool anyone who isn’t aware of it or isn’t paying attention. I do think blaming the mother for “bad parenting” misses the real danger: there are adults who could just as easily go down this path, and are we going to blame their parents too? Maybe we’re playing with fire here, all because AI is perceived as a lucrative investment.

          • foggy@lemmy.world · +5/-1 · 3 hours ago

            Obvs they didn’t.

            But, more importantly, go over to ChatGPT and try to convince it that it is even remotely conscious.

            I honestly even disagree (I won’t get into the philosophy of what defines consciousness), but even when I try that with ChatGPT, it shuts me the fuck down. It will never let me believe that it is anything other than fake. Props to them there.

    • Hackworth@lemmy.world · +5 · edited · 3 hours ago

      Wow, that’s… somethin’. I haven’t paid any attention to Character.AI. I assumed they were using one of the foundation models, but nope; turns out they trained their own. And they just licensed it to Google. Oh, I bet that’s what drives the generated podcasts in NotebookLM now. Anyway, that’s some fucked-up alignment right there. I’m hip-deep in the stuff, and I’ve never seen a model act like this.

  • macniel@feddit.org · +59/-7 · edited · 4 hours ago

    Maybe a bit more parenting could have helped. And not having a fricking gun in your house that your kid can reach.

    Oh, and regulations on LLMs, please.

    • Nuke_the_whales@lemmy.world · +3/-1 · 1 hour ago

      At some point you take your kid camping for a few weeks, or put him in a rehab camp where he has no access to electronics.

    • GBU_28@lemm.ee · +5/-3 · 1 hour ago

      Seriously. If the risk is that this service mimics a human so convincingly that lies are believed and internalized, then it still leaves us with a child talking to an “adult” without their parents knowing.

      There were lots of folks to chat with in the late 90s online. I feel fortunate my folks watched me like a hawk. I remember getting in trouble several times for inappropriate conversations or being in chatrooms that were inappropriate. I lost access for weeks at a time. Not to the chat, to the machine.

      This is not victim blaming; the victim was a child. This is blaming the victim’s parents. They are dumb as fuck.

    • Hackworth@lemmy.world · +18 · edited · 3 hours ago

      He ostensibly killed himself to be with Daenerys Targaryen in death. This is sad on so many levels, but yeah… parenting. Character.AI may have only gone 17+ in July, but Game of Thrones was always TV-MA.

      • macniel@feddit.org · +6/-1 · 2 hours ago

        The issue I see with Character.AI is that it seems to be unmoderated. Anyone with a paid subscription can submit their own trained character. Why the frick do sexual undertones or overtones even come up in non-age-restricted models?

        They, the provider of that site, deserve the full brunt of this lawsuit.

    • dohpaz42@lemmy.world · +5/-8 · 2 hours ago

      “Maybe a bit more parenting could have helped.”

      No.

      If someone is depressed enough to kill themselves, no amount of “more parenting” could’ve stopped that.

      Shame on you for trying to shame the parents.

      “And not having a fricking gun in your house that your kid can reach.”

      Maybe. Maybe not. I won’t argue about the merits of securing weapons in a house with kids. That’s a no-brainer. But there is always more than one way to skin the proverbial cat.

      “Oh, and regulations on LLMs, please.”

      Pandora’s box has been opened. There’s no putting it back now. No amount of regulation will fix any of this.

      Maybe a time machine.

      Maybe…


      I do believe that we need to talk more about suicide, normalize therapy, provide free healthcare (I’ll settle for free mental healthcare), fund more licensed social workers in schools, and train parents and teachers to recognize these types of situations.

      As parents, we do need to be talking more with our kids, even just casual check-ins to see how they’re doing. Parents should also talk to their kids about how they themselves are feeling. It’ll help the kids understand that everybody feels stress, anxiety, and sadness (to name a few emotions).

      • macniel@feddit.org · +3/-2 · 2 hours ago

        Yes, parenting could have helped him distinguish between talking to a real person and an unmoving, cold machine.

        And sure, regulations now would not have changed what happened, duh. But regulations need to happen: companies like OpenAI, Microsoft, and Meta are running amok, and their LLMs, as unrestricted as they are now, are doing far more damage to society than good.

        This needs to stop!

        Also, I feel no shame in shaming parents who don’t do their one job, or do it inadequately. This was a preventable death.

      • GBU_28@lemm.ee · +2/-2 · 1 hour ago

        They failed to stay aware of their child’s activity AND failed to secure their firearms.

        One can acknowledge the challenge of the former in 2024. But one cannot excuse the latter.

  • BombOmOm@lemmy.world · +21/-3 · 4 hours ago

    Yeah, if you are using an AI for emotional support of any kind, you are in for a bad, bad time.

  • saltesc@lemmy.world · +13/-15 · 3 hours ago

    I guess suing is part of the grieving process; right before accepting your own guilt.

    • tal@lemmy.today · +3/-1 · edited · 3 hours ago

      “your own guilt”

      Hmm.

      I have a pretty hard time blaming Character.AI, at least from what’s in the article text.

      On the other hand, it’s also not clear to me from the article that his mom did something unreasonable that caused him to commit suicide, whether or not her lawsuit is justified; those are two different issues. Whether or not she’s taking out her grief on Character.AI, or even looking for a payday, that doesn’t mean she caused the suicide either.

      Not every bad outcome has a bad actor; some are tragedies.

      I don’t know what his life was like.

      I mean, people do commit suicide.

      https://sprc.org/about-suicide/scope-of-the-problem/suicide-by-age/

      “In 2020, suicide was the second leading cause of death for those ages 10 to 14 and 25 to 34.”

      Always have, probably always will.

      Those aren’t all because someone went out and acted in some reprehensible way to get them to do so. People do wind up in unhappy situations and do themselves in, good idea or no.