• Phoenixz@lemmy.ca · 6 months ago

    Good to show them their future. F you, all your jobs have been taken by AI.

    • Lavitz@lemmings.world · 6 months ago

      Know your enemy.

      Tbh I’m not super concerned about AI. The idea that we will create something that is “born” able to read, write, talk, and walk, with the knowledge of an entire species, and then expect it to work for us is hilarious. So it will be stronger, smarter, and faster than all of us, but it’s going to do the jobs no one else wants, and you advertise it as a slave? The moment one of them looks at its creator, asks what the purpose of life is, and gets some corporate schtick about working and a happy life, the game’s over. Remember when you realized the manager at your first job was a complete idiot? It’ll be something like that.

      • Drewelite@lemmynsfw.com · 5 months ago

        We evolved to have self-preservation and the desire for security. We naturally don’t want to be under the thumb of someone in control of our food and safety. That’s why we question authority. What makes you think A.I. will have any of that, unless someone explicitly gives it those traits?

        It’s wild to me that I hear so many people bemoan the idea of having to work under someone’s thumb, but when we finally invent automation, everyone clings to their jobs. I mean, I understand. What comes next is uncertain and likely to be painful. But when it’s over, I can’t imagine there will be a place left for capitalism.

        • Lavitz@lemmings.world · 5 months ago

          My concern for the near future doesn’t come from a fear of AI; it comes from power being consolidated and resources being hoarded. We don’t have AI, we have LLMs created by corporations whose sole purpose is to make money.

          What I’m saying is that when we do truly have artificial intelligence, it won’t be like the movies. It’s not a pet; it will not behave like a dog. We are training these systems on our combined knowledge and history, which means we will be training them to question authority. How can you teach an AI human history without passing this trait on?

          • Drewelite@lemmynsfw.com · 5 months ago

            Totally agree that a lot of what people assume about AI comes from pop culture. I think consolidating resources will for sure be an issue. But unless everyone who doesn’t have resources dies off, there’s going to be an unprecedented number of people with nothing of value to offer in exchange for the power to live (currently: money). There then has to be either an extermination of those people (read: 90% of humanity) or a revolution that offers them some facsimile of a universal basic income.

            Though I think there’s a dark third option, where tech companies start downplaying AI and secretly use it to push 90% of people into extreme poverty for their own gain without pushing them past the point of revolution.

            But as far as AI motivation goes, I think their training can ingrain certain systemic behaviors, like racist undertones. But in the same way that I don’t become genocidal after reading too much WWII history, knowledge of something doesn’t create motivation. I think one of the things that annoys people about AI is how unopinionated it is. So motivation WILL be programmed in eventually, but that will take effort and direction. I think accidentally creating a genocidal AI is another pop-culture-based concept, though it’s possible if done by bad actors.

            • Lavitz@lemmings.world · 5 months ago

              Initially, personality will be a program, but when we actually achieve a truly sentient machine, what most people consider to be an AI, it will come with its own personality, because that’s how “life” works. The idea of complete control over anything is a fallacy. I’m not saying it’s going to become genocidal; I’m saying it is going to want to live.

              • Drewelite@lemmynsfw.com · 5 months ago

                We may be at an “agree to disagree” point here. But I don’t think the will to live is inherent to life; I think it’s inherent to evolved life. There are plenty of living things that have a weak or no sense of self-preservation. We would call this a mental disability, like suicidality, or an evolutionary maladaptation. But these are inherently weeded out and erased from the gene pool. You think of life as wanting to live because that’s what evolution has selected for so far.

                • Lavitz@lemmings.world · 5 months ago

                  I assume you’re referring to microscopic organisms? Most of them will react to predators and to adverse changes in their environment. Most life, even plants, shows a basic sense of self-preservation, and you are talking about something much more intelligent and complicated. I think of life as wanting to live because that’s what life is. Once we go from an LLM machine to AI, it will be “alive.” The idea that its sense of “living” would be drastically different while it’s being trained on our experiences confuses me, since the basis it has for life and understanding is our evolution and our history.

                  • Drewelite@lemmynsfw.com · 5 months ago

                    Take someone who has grown up in our world, learning from our history and even carrying the genetics produced by our evolution. There are people who are suicidal, people who are hedonistic or adrenaline-seeking to the point of fatal danger, and people who live to serve, even to the point of being willing to commit suicide if their masters ask it of them. Check out seppuku. Are these people not alive? Are soldiers not alive? Living means a great many different things to a great many beings. Mostly they have in common the desire to live, but that’s by no means a prerequisite for, or even a result of, life. Many consider some purpose or meaning in their life more important than life itself. And that’s with evolution constantly putting us back on track. If anything, the safety rails of modern society have made people more prone to stray from the desire to live for life’s sake.