• @[email protected]
    5 • 1 month ago

    I’d suspect the low “density” of context makes it prone to hallucinations. You need to load in 3000 lines to express what Python does in 3, so there are a lot more chances to guess the next token wrong.
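
    As a toy illustration of that “density” point (whitespace splitting is only a crude stand-in for a real LLM tokenizer like BPE, but the ratio is similar):

    ```python
    # The same computation expressed at two abstraction levels.
    python_version = "total = sum(x * x for x in range(10))"

    c_version = """
    int total = 0;
    for (int x = 0; x < 10; x++) {
        total += x * x;
    }
    """

    # Crude proxy for token count: whitespace-separated pieces.
    py_tokens = len(python_version.split())
    c_tokens = len(c_version.split())

    # The C version needs several times as many tokens, so there are
    # several times as many next-token predictions that could go wrong.
    print(py_tokens, c_tokens)
    ```

    Scale that ratio up to thousands of lines of boilerplate and the model gets many more opportunities per unit of meaning to drift off course.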

    • Sckharshantallas
      2 • 1 month ago

      I was gonna say that: the higher the abstraction level, the better it probably is for LLMs to reason about the code, because once the abstraction is learned, the same idea takes fewer tokens.