LLM Memory
I've been thinking about LLM memory since GPT-3 came out.
Back then, my LLM side project was story generation (i.e. fiction). Context windows were tiny: 4K tokens, input and output combined, so just a few pages of text. How do you write a novella when your entire knowledge of the text is only a few pages, as if you were an amnesiac author?
Suppose you're writing a scene. Where does the scene take place? Your description must match any previous description of the same place and be consistent with...