• mommykink@lemmy.world
    2 months ago

    I made a customized ChatGPT therapist that aligned with my values.

    “I made a hugbot that’s designed to never challenge or confront me and am now using it in place of a licensed and trained medical provider (and suggesting that other people should do the same!). Nothing can go wrong.”

    • theilleists@lemmy.world
      2 months ago

      In my experience (which, to be fair, seems to be different from many people’s), it couldn’t be any worse than the real thing. 12 different licensed and trained medical providers each responded to my complaints about the ongoing traumas in my life with some variation of “Sure, but focus on the positives!” I’d have been better off saving the money and venting to a chatbot, if venting did anything for me.

      Please don’t tell me to see a 13th. I’m completely done with the idea.

  • 200ok@lemmy.world
    2 months ago

    I asked ChatGPT what some of the risks are of using it in place of a certified therapist. This was the point I found most salient:

    Ethical Concerns: ChatGPT is not bound by the same ethical guidelines as therapists, which include confidentiality, handling crises, and ensuring patient well-being.