• 9point6@lemmy.world
    9 months ago

    This looks more like a floating point issue than a mistake an LLM would make
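
    For anyone who hasn’t been bitten by it yet, this is the classic floating point gotcha in plain Python (just an illustration, not code from the panel):

    ```python
    # Binary floats cannot represent most decimal fractions exactly,
    # so two "equal" values can drift apart after arithmetic.
    total = 0.1 + 0.2
    print(total)         # 0.30000000000000004
    print(total == 0.3)  # False
    ```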

    • Cosmicomical@kbin.social
      9 months ago

      There are no LLMs involved in this picture; to train an LLM you’d need 100x the training data. The panel is about a normal ML model.

      • Gaia [She/Her]@lemmygrad.ml
        9 months ago

        If anything, the training data here is exaggerated even more. This task should take kilobytes at most and would finish in a fraction of a second. Also, no self-respecting ML engineer would put together an ML system without accounting for every data type.
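
        As a rough illustration of what “accounting for every data type” could look like (a NumPy-flavoured sketch, not anyone’s real pipeline; validate_features is made up for the example):

        ```python
        import numpy as np

        def validate_features(X: np.ndarray) -> np.ndarray:
            """Reject unexpected dtypes instead of letting them be cast silently."""
            if not np.issubdtype(X.dtype, np.floating):
                raise TypeError(f"expected float features, got {X.dtype}")
            # At kilobyte scale there is no reason not to keep full float64 precision.
            return X.astype(np.float64, copy=False)

        X = np.array([[1, 2], [3, 4]], dtype=np.float32)
        print(validate_features(X).dtype)  # float64
        ```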

    • CodeMonkey@programming.dev
      9 months ago

      But a floating point issue is exactly the type of mistake an LLM would make (it does not understand what a floating point number is or why you should treat it differently). To be fair, a junior developer could make the same kind of mistake.
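
      To show the kind of fix a careful reviewer would ask for (standard-library Python, just a sketch):

      ```python
      import math
      from decimal import Decimal

      total = 0.1 + 0.2

      # The naive check that slips through without review:
      print(total == 0.3)              # False

      # Compare with a tolerance instead...
      print(math.isclose(total, 0.3))  # True

      # ...or use exact decimal arithmetic where precision actually matters.
      print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
      ```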

      A junior developer is, hopefully, being mentored by more senior coworkers who are extra careful with code reviews and would spot the bug for them. Machine-generated code needs an even higher level of scrutiny.

      It is relatively easy to teach a junior developer to write code that is easy to read and conforms to the team’s style guide.