devstein / langchain

⚡ Building applications with LLMs through composability ⚡
MIT License

Dynamic Docs Chunk: Solution for people who keep hitting "model's maximum context length" limit on chain_type refine #3

Open devstein opened 1 year ago

devstein commented 1 year ago

Feature request

Dynamic Docs Chunk: Solution for people who keep hitting the "model's maximum context length" limit on chain_type refine.

With the refine chain type, we sequentially combine the existing answer with new context. Sometimes the existing answer grows large enough that the combined existing answer + new context exceeds the model's maximum context length.

Solution: Make it possible for the next context docs chunk to be resized dynamically based on the size of the existing answer.

Motivation

I have a use case where I generate top suggestions for an author from a stream of texts, using the chain_type "refine". As I scale up the number of texts processed, I keep hitting the "model's maximum context length" limit. Dynamic docs chunks that automatically resize the next context according to the size of the existing answer would solve this.

Your contribution

I would love to contribute to making this feature a reality!