About 59% of Americans say TikTok is a threat to the national security of the United States, according to a new survey of U.S. adults.

  • NightOwl@lemmy.one · 1 year ago

    I’ve come to the conclusion that it is algorithms that have become evil. There was a thread where someone was asking for help stopping YouTube from radicalizing their mother through the videos it would suggest to her.

    I use apps like NewPipe and FreeTube to try to get away from these personalized feeds, since there is still good content on YouTube. The problem is that so many sites try to keep you around as long as possible and end up feeding you content that can warp people. But algorithms don’t understand the impact of it, since it’s either a 0 or 1 of user stays or user leaves.
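    A toy sketch (entirely hypothetical names, nothing like YouTube’s real system) of what optimizing for that single stay-or-leave bit looks like:

```python
# Hypothetical engagement-only recommender: the only ranking signal
# is "did the user stay?" (1) or "did the user leave?" (0). Nothing
# in the score reflects whether the content is healthy or accurate.

def update_score(scores, video_id, user_stayed):
    """Nudge a video's score up or down based purely on retention."""
    delta = 1 if user_stayed else -1
    scores[video_id] = scores.get(video_id, 0) + delta
    return scores

def recommend(scores):
    """Always surface whatever kept people watching the longest."""
    return max(scores, key=scores.get)

scores = {}
# Viewers click away from the calm documentary but stay for outrage:
for video, stayed in [("calm-doc", 0), ("outrage-clip", 1), ("outrage-clip", 1)]:
    update_score(scores, video, stayed)

print(recommend(scores))  # the outrage clip wins purely on retention
```

    Nothing in that loop can tell an outrage clip from a documentary; retention is the only thing it can see.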

    • SCB@lemmy.world · 1 year ago

      If you can be radicalized by videos from YouTube, it isn’t the algorithm, it’s you

          • NightOwl@lemmy.one · 1 year ago

            The world doesn’t exist in individual vacuums. People negatively influenced by disinformation go on to take a role in society and interact with others, affecting the people they encounter either negatively or positively. Congratulations on your individual resilience, but the world is not a population consisting of only you, with you alone determining the impact other people have on it.

            • SCB@lemmy.world · 1 year ago (edited)

              Yeah and, once again, those people are the problem.

              Unless you want to ban any food that isn’t fruits and vegetables, ban cars, ban not sleeping enough, ban not getting enough exercise, etc., at some point you have to accept that people do in fact make their own choices.

              I’m not for banning things because some people are idiots.

              • surewhynotlem@lemmy.world · 1 year ago

                > you have to accept that people do in fact make their own choices.

                I feel bad that you’ve been radicalized into thinking this way.

              • hglman@lemmy.ml · 1 year ago

                It’s not just all you or all YouTube. Both matter. It’s harmfully reductionist to act like it’s only one and not both.

                • SCB@lemmy.world · 1 year ago

                  Both really don’t matter, since adults have a right to choose to consume any content they’d like.

                  If your grandma finds Q fascinating, that’s on your grandma

    • Stefen Auris@pawb.social · 1 year ago

      Algorithms can’t “become evil” any more than your toaster can. They’re being directed and programmed by people who know exactly what they intend to achieve.

      • NightOwl@lemmy.one · 1 year ago (edited)

        > But, algorithms don’t understand the impact of it, since it’s either a 0 or 1 of user stays or user leaves.

        It’s to say that algorithms, despite having no intent to be evil, have led to negative impacts because they have no care for the context of a recommendation. Someone can go in searching for health information, then go down a rabbit hole of being recommended pseudo-health advice, then flat earth, and so on. Not because the algorithm wants to turn people a certain way, but because it’s just recommending videos that users who liked similar videos might find interesting. It’s just broad categories.

        I wasn’t implying algorithms are sentient. At least not yet, until AI integration happens.
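        To illustrate the rabbit hole: a toy sketch (made-up categories, not any real platform’s data) of “users who liked X also liked Y” chaining, where each hop looks locally reasonable but the chain drifts:

```python
# Hypothetical co-viewing graph: each category points to the category
# most often watched next by similar users. The algorithm only follows
# links; it has no notion of whether a hop leads somewhere harmful.

also_watched = {
    "home remedies": ["supplement reviews"],
    "supplement reviews": ["miracle cures"],
    "miracle cures": ["flat earth"],
}

def rabbit_hole(start, steps):
    """Follow the top co-viewed suggestion for a few clicks."""
    path = [start]
    for _ in range(steps):
        nxt = also_watched.get(path[-1])
        if not nxt:
            break
        path.append(nxt[0])
    return path

print(rabbit_hole("home remedies", 3))
# ['home remedies', 'supplement reviews', 'miracle cures', 'flat earth']
```

        Each individual recommendation is defensible in isolation; the drift only shows up over the whole path.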