Closed ghost closed 1 year ago
Glad to hear Chainlit fits your needs! I still remember our conversation and I haven't forgotten about this issue 😄. We will dig into it and hopefully fix it soon!
I am in no hurry; this issue was opened as suggested while the other one was closed.
Could you try running:
`chainlit hello`
I would like to see if the rework fixes these issues. Note that this version is not merged yet. Installation instructions are at the end of the PR's description.
The `-w` argument still doesn't work.
This should be fixed in the latest version, 0.3.0. Please note that it contains breaking changes; we prepared a migration guide to make the transition easy for everyone.
The `-w` argument works! But there's a problem with async while running LLMs locally.
With `@cl.langchain_factory(use_async=True)`, the following error appears:
This means the LLM you are using does not have an async implementation (at the langchain level, not chainlit). Whenever possible you should use the async implementation, but in this case you will have to fall back to the sync implementation, as it is the only one available. To do that, change your factory to `@cl.langchain_factory(use_async=False)`.
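To illustrate why the async path fails for some locally run models, here is a minimal self-contained sketch (the class and function names are hypothetical illustrations, not actual langchain or chainlit code): a base class whose async method is only implemented by some subclasses, mirroring how certain LLM wrappers ship only a sync call.

```python
import asyncio


class BaseLLM:
    """Hypothetical base class mirroring an LLM wrapper interface."""

    def generate(self, prompt: str) -> str:
        raise NotImplementedError

    async def agenerate(self, prompt: str) -> str:
        # The async path is not implemented unless the subclass provides it,
        # which is the situation described in the error above.
        raise NotImplementedError("This LLM has no async implementation")


class SyncOnlyLLM(BaseLLM):
    """A wrapper that only implements the synchronous call."""

    def generate(self, prompt: str) -> str:
        return f"echo: {prompt}"


def run(llm: BaseLLM, prompt: str, use_async: bool) -> str:
    # Roughly analogous to the use_async flag: the async route raises
    # for sync-only LLMs, so you must fall back to the sync route.
    if use_async:
        return asyncio.run(llm.agenerate(prompt))
    return llm.generate(prompt)
```

Calling `run(SyncOnlyLLM(), "hi", use_async=True)` raises `NotImplementedError`, while `use_async=False` succeeds, which is exactly the fallback the decorator change performs.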
As the title says, it's a pleasure to start developing chat-based UIs swiftly through Chainlit; it would be great if the `-w` argument started working on Windows in an upcoming update.