• ℕ𝕖𝕞𝕠@slrpnk.net · 1 year ago

    I’m not my body and I’m not my mind. I am the ethical soul, the decision-making process. If the replacement makes all the same decisions I would, it IS me.

    • queermunist she/her@lemmy.ml · 1 year ago

      What if something like ChatGPT is trained on a dataset of your life and uses that to make the same decisions you would? It doesn’t have a mind, memories, emotions, or even a phenomenal experience of the world. It’s just a large language model built from data about your life, with algorithms to sort out decisions; it’s not even a person.

      Is that you?

        • queermunist she/her@lemmy.ml · 1 year ago

          I’m having a hard time imagining a decision that can’t be language based.

          You come to a fork in the road and choose to go right. Obviously no language was involved in that decision, but the decision can certainly be expressed in language, and so a large language model can make that same decision.

            • queermunist she/her@lemmy.ml · 1 year ago (edited)

              It doesn’t matter how it comes to make a decision as long as the outcome is the same.

              Sorry, this is beside the point. Forget ChatGPT.

              What I meant was a set of algorithms that produce the same outputs as your own choices, even though it doesn’t involve any thoughts or feelings or experiences. Not a true intelligence, just an NPC that acts exactly like you act. Imagine this thing exists. Are you saying that this is indistinguishable from you?