Context Rot: How increasing input tokens impacts LLM performance
Link: https://research.trychroma.com/context-rot
Discussion: https://news.ycombinator.com/item?id=44564248
If you have a fediverse account, you can quote this note from your own instance. Search https://social.lansky.name/users/hn100/statuses/114854914269509248 on your instance and quote it. (Note that quoting is not supported in Mastodon.)