
Disable LLM Response Cache #107

Closed: monuminu closed this issue 1 year ago

monuminu commented 1 year ago

My calls to the LLM are getting cached. How can I disable this? It is causing a huge problem while I am developing my application.

willydouhard commented 1 year ago

If you are using LangChain, responses are cached by default. You can add the --no-cache option to your chainlit run command to disable it.
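
For example, assuming your app's entry point is `app.py` (the filename is not given in the thread):

```shell
# Run the Chainlit app with LangChain response caching disabled.
# app.py is a placeholder; use your own entry point.
chainlit run app.py --no-cache
```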