• just another dev@lemmy.my-box.dev · 3 months ago

    I’m not sure how long ago that was, but LLM context sizes have grown rapidly in the past year or two, from 4k tokens to over a hundred thousand. That doesn’t necessarily improve the quality of the output, but you can’t expect a model to summarize what it can’t hold in memory.
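
    To illustrate the last point, here is a minimal sketch of checking whether a document plausibly fits in a model’s context window before asking for a summary. The chars-per-token ratio is a crude heuristic for English text (real tokenizers vary), and the function names and reserve size are made up for the example:

    ```python
    # Rough sketch: decide if a document fits in a context window before
    # summarizing. The ~4 chars/token ratio is a crude English-text
    # heuristic, not an exact tokenizer count.

    def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
        """Approximate token count from character length."""
        return int(len(text) / chars_per_token) + 1

    def fits_in_context(text: str, context_tokens: int,
                        reserve_for_output: int = 1024) -> bool:
        """Leave room for the model's own summary in the window."""
        return estimate_tokens(text) <= context_tokens - reserve_for_output

    doc = "word " * 50_000          # ~250k characters, ~62k tokens
    print(fits_in_context(doc, 4_000))     # an old 4k-token window
    print(fits_in_context(doc, 128_000))   # a modern 100k+ window
    ```

    With a 4k window the document simply doesn’t fit, so any “summary” would only cover the truncated portion; a 100k+ window can take the whole thing in one pass.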