• Cosmic Cleric@lemmy.world · 6 months ago

    I think it will be impossible for us to assess how much it actually impacts function in real-world use cases.

    It does seem fair to say, though, that if you have 85% fewer data inputs/probes, you're losing some to a large amount of fidelity, and an algorithm can only make up for so much of that.

    A potentially bad analogy, but think of it as a high bitrate versus a low bitrate for listening to music. The quality of the music will be noticeably different, but you would still be able to hear both songs in their entirety.
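
    To put a rough number on that analogy, here's a little Python sketch (purely illustrative, not Neuralink data or their actual decoder): keep only a fraction of a signal's samples, interpolate the rest back, and see how the reconstruction error grows. The test signal and the interpolation method are arbitrary choices made just for the example.

    ```python
    # Illustrative only: shows how dropping most samples degrades a
    # reconstructed signal, even with interpolation filling the gaps.
    import numpy as np

    rng = np.random.default_rng(0)
    t_full = np.linspace(0, 1, 1000)  # dense "high bitrate" timeline
    signal = np.sin(2 * np.pi * 5 * t_full) + 0.3 * np.sin(2 * np.pi * 13 * t_full)

    def reconstruction_error(keep_fraction: float) -> float:
        """Keep a random fraction of samples, interpolate back, return RMS error."""
        n_keep = max(2, int(len(t_full) * keep_fraction))
        idx = np.sort(rng.choice(len(t_full), size=n_keep, replace=False))
        reconstructed = np.interp(t_full, t_full[idx], signal[idx])
        return float(np.sqrt(np.mean((reconstructed - signal) ** 2)))

    for frac in (1.0, 0.5, 0.15):  # 0.15 ~ keeping 15% after an "85% loss"
        print(f"keeping {frac:.0%} of samples -> RMS error {reconstruction_error(frac):.4f}")
    ```

    The error never goes away once samples are gone; the interpolation just smooths over what's missing, which is the point about the algorithm only being able to compensate so much.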

    At the end of the day, data that the algorithm was originally expected to work with is now missing.

    Anti Commercial-AI license (CC BY-NC-SA 4.0)

    • NotMyOldRedditName@lemmy.world · 6 months ago

      Currently they’ve been having him control a cursor. He can left- and right-click with it.

      It seems he can now perform as well as he could before the problem, so if that’s the case, that extra bitrate wasn’t needed for that task.

      What this probably means is that he won’t be able to do as much as they learn more about what it can do.

      Maybe a year in, the 2nd patient, with full fidelity, is able to attach it to a robotic arm and fetch themselves a drink, while Nolan, even though he can click just as well, won’t be able to do that.

      Also, if they fix it eventually (they didn’t say never, just not yet), they’ll never know whether that discrepancy would have occurred.