• suburban_hillbilly@lemmy.ml
    6 months ago

    Bet you a tenner that within a couple of years they start using these systems as distributed processing for their in-house AI training to subsidize costs.

    • 8ender@lemmy.world
      6 months ago

      That was my first thought. Server-side LLMs are extraordinarily expensive to run. Offload the costs to users.