Context Rot: How increasing input tokens impacts LLM performance
Link: https://research.trychroma.com/context-rot
Discussion: https://news.ycombinator.com/item?id=44564248