
It has been suggested in the answer to this earlier question that it is just remembering a certain amount of recent information. The reference used is this post by OpenAI, which says that ChatGPT should only be able to maintain a context of around 3,000 words.

However, I've tested feeding it about 10K words over multiple requests and then asking it to summarize all of it together, and it recalled the earlier parts of the conversation just fine.

This behavior seems to go beyond that of GPT-3, which has a hard limit on the amount of text that can be passed as input.

So, does anyone know how it is maintaining context? Can the model handle much larger inputs overall, with only a per-message limit on input, or is the conversation being processed differently to retain a larger context?
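To illustrate the second possibility, here is a rough sketch of the simplest mechanism I can imagine: the client re-sends as much of the recent conversation as fits in a fixed token budget with every request. This is purely a guess, not anything OpenAI has documented; the 4,000-token budget and the whitespace "tokenizer" are placeholder assumptions.

```python
# Hypothetical sketch: maintain "context" by re-sending as much recent
# conversation as fits in a fixed token budget with every request.
# TOKEN_BUDGET and count_tokens are stand-ins for illustration only;
# real models use BPE tokenizers, which count tokens differently.

TOKEN_BUDGET = 4000  # assumed per-request limit


def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer.
    return len(text.split())


def build_prompt(history: list[str], new_message: str) -> str:
    """Pack the newest message plus as many prior turns as fit."""
    parts = [new_message]
    used = count_tokens(new_message)
    for turn in reversed(history):  # walk from the newest turn backwards
        cost = count_tokens(turn)
        if used + cost > TOKEN_BUDGET:
            break  # older turns silently fall out of context
        parts.insert(0, turn)
        used += cost
    return "\n".join(parts)


history = [
    "User: Here are the first few thousand words of my notes ...",
    "Assistant: Got it.",
    "User: Here are a few thousand more ...",
    "Assistant: Noted.",
]
prompt = build_prompt(history, "Summarize everything I've told you so far.")
# `prompt` would then be sent as the model input; any turn that did not
# fit in the budget is simply invisible to the model on this request.
```

Under a scheme like this, "memory" would degrade gradually as older turns drop out, which fits a roughly 3,000-word horizon but not the recall of a full 10K words that I observed.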

Kay999
  • One would think that they aren't retaining the actual words, and are instead using some other way to represent the information that ChatGPT itself understands and can use to maintain context. So, where the individual inputs are limited, the overall context is not. – Kay999 Jan 09 '23 at 21:16
  • One possibly unrelated initiative for expanding GPT-3's limit is this project: https://gpt-index.readthedocs.io/en/latest/ That doesn't get around the 4K token limit, but it lets the model work off of more information by giving GPT-3 the relevant info in every request (a sketch of this pattern follows below). – Kay999 Jan 10 '23 at 11:11
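To make that last comment concrete, here is a minimal sketch of the "relevant info in every request" retrieval pattern. The chunking scheme, the word-overlap scoring, and the function names are toy assumptions for illustration; this is not gpt-index's actual implementation.

```python
# Hypothetical sketch of retrieval-style context: keep all past text
# outside the model and send only the most relevant chunks with each
# request. Word-overlap scoring is a toy stand-in for real embeddings.


def chunk(text: str, size: int = 200) -> list[str]:
    """Split text into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]


def overlap_score(query: str, passage: str) -> int:
    # Toy relevance measure: number of unique words shared.
    return len(set(query.lower().split()) & set(passage.lower().split()))


def build_prompt(store: list[str], question: str, top_k: int = 3) -> str:
    """Prepend only the top_k most relevant stored chunks."""
    ranked = sorted(store, key=lambda c: overlap_score(question, c), reverse=True)
    context = "\n---\n".join(ranked[:top_k])
    return f"Use the context below to answer.\n\n{context}\n\nQuestion: {question}"


store: list[str] = []
store.extend(chunk("... thousands of words of earlier conversation ..."))
prompt = build_prompt(store, "What did I say earlier about the deadline?")
# Only the top-k chunks travel with the request, so the per-request
# token limit is respected while older information stays reachable.
```

The per-request input stays small, so the 4K token limit is never exceeded, yet arbitrarily old information can be surfaced when it is relevant to the new question.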

0 Answers