Chainlit / chainlit

Build Conversational AI in minutes ⚡️
https://docs.chainlit.io
Apache License 2.0

Export ASGI `app` allowing for `app.mount()` instead of `mount_chainlit()` #1223

Open dokterbob opened 2 months ago

dokterbob commented 2 months ago

Is your feature request related to a problem?
In order to include chainlit in a third-party FastAPI/ASGI/WSGI app, it is currently necessary to call mount_chainlit().

As this is not a common way to expose apps for mounting in other apps, it leads to confusion amongst developers. See #1166 and #1220.

Specifically, the current implementation requires that mount_chainlit() be called after any other endpoints are defined. Since chainlit is mounted on a subpath, this ordering requirement is not intuitive.

The current implementation also makes it impossible to mount chainlit multiple times (e.g. with different configuration, on different paths). Note that the current suggestion will not directly enable this, but it seems a logical step in the direction of supporting multiple chainlits.

Describe the solution you'd like
A chainlit app that can either be imported from a module or generated dynamically by a factory function (get_app()), as an independent and self-contained ASGI app.

This would include deriving the root path from the request path (rather than having it hardcoded and passed in by mount_chainlit(), as is currently done). This also seems a cleaner approach.
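To make the proposal concrete, here is a pure-stdlib sketch (all names hypothetical; get_app() does not exist in chainlit today) of a self-contained ASGI app that derives its mount prefix from the request scope instead of receiving it hardcoded. ASGI servers and routers record the prefix under which an app is mounted in scope["root_path"], which is what the toy Mount dispatcher below mimics, the way Starlette/FastAPI do for app.mount():

```python
import asyncio

def get_app():
    """Hypothetical factory returning a self-contained ASGI app."""
    async def app(scope, receive, send):
        assert scope["type"] == "http"
        # Derive the mount prefix from the request instead of having the
        # host pass it in: routers place it in scope["root_path"].
        root = scope.get("root_path", "")
        body = f"root_path={root} path={scope['path']}".encode()
        await send({"type": "http.response.start", "status": 200,
                    "headers": [(b"content-type", b"text/plain")]})
        await send({"type": "http.response.body", "body": body})
    return app

class Mount:
    """Toy dispatcher mimicking app.mount(): strips the prefix from the
    path and records it in root_path, as Starlette/FastAPI do."""
    def __init__(self, prefix, app):
        self.prefix, self.app = prefix.rstrip("/"), app

    async def __call__(self, scope, receive, send):
        if scope["type"] == "http" and scope["path"].startswith(self.prefix):
            child = dict(scope)
            child["root_path"] = scope.get("root_path", "") + self.prefix
            child["path"] = scope["path"][len(self.prefix):] or "/"
            await self.app(child, receive, send)

async def demo():
    sent = []
    async def send(msg): sent.append(msg)
    async def receive(): return {"type": "http.request"}
    server = Mount("/chainlit", get_app())
    await server({"type": "http", "path": "/chainlit/ws", "root_path": ""},
                 receive, send)
    return sent[1]["body"].decode()

print(asyncio.run(demo()))  # → root_path=/chainlit path=/ws
```

Because the child app only ever inspects its own scope, the same factory could be mounted at several different paths, which is the "multiple chainlits" direction mentioned above.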

stephenrs commented 2 months ago

While I don't have enough direct experience with ASGI to comment on specifics, I'll give anything that makes chainlit less of a bully a thumbs up :)

It seems the original underlying philosophical notion is that CL is going to be the only thing running on a server...it wants to do auth...it wants to terminate SSL...it wants you to open ports on your firewall and trust it (or let it take over 80/443)...and if I'm reading this and the related issues correctly, it wants to control routing. So, "Yes" to anything that makes CL play more nicely with others and be usable in a wider variety of environments.

...or maybe there's room here to consider splitting the project into a "Chainlit" and a "Chainlit Server"...could I be the only one who just wants the core LLM/UI stuff and doesn't need any of the server/network-level elements?

In my case, since I didn't see any mention of Flask in the docs (and didn't get any tips in discord), I resorted to loading CL in an iframe from a Flask template. Ugly (but isolated).

tituslhy commented 2 months ago

I'd like to upvote this issue please. After mounting my chainlit app on FastAPI, I'm also not sure how to invoke it as a FastAPI sub-application (https://fastapi.tiangolo.com/advanced/sub-applications/#check-the-automatic-api-docs). But I have verified that browsing to 0.0.0.0:port_number/chainlit opens the standard Chainlit frontend.

stephenrs commented 2 months ago

I really hope some serious thought is being given to breaking CL free from the pristine localhost prison that it is currently in. Hard for folks to adopt something that is only comfortable in one deployment scenario.

For people building production-grade products and trying to incorporate CL, it would be helpful to have some kind of vision statement, or at least a status update, from @willydouhard. I'm personally finding that the beautiful website, docs, and marketing don't match the reality, by a wide gap, unfortunately...and finding this out has been costly in my case. I would never have guessed that something that worked perfectly locally would be effectively impossible to migrate to a remote server environment. I've just never seen this before, and I've been building stuff for a long time.

And, I'll say that it's a bit tragic because CL is a beautiful piece of work at its core. But having it wrapped and buried beneath layers of networking and server complexity and pseudo-security that virtually no existing business is going to need (as far as I can tell) is just a shame. Hobby projects might be in a different category though, so who is the intended audience? What is CL? What is its mission?

No pressure, but I still intend to get a production release of CL/Copilot done in the immediate term. The decision I have to make is how much hacking of 1.1.402 I'll need to do to produce an MVP and whether that is going to mean I've forked too far from main to feasibly heal the separation any time soon.

I've been asked to provide feedback on some specific issues, which I've done as requested...any response?

stephenrs commented 1 month ago

@dokterbob Bump. Any response?

In general, when you ask someone to provide feedback on an issue, it is considered a sign of courtesy and respect to respond to them after they do.

stephenrs commented 1 month ago

@dokterbob Bump. Thoughts? Some important questions have been raised on this issue, so it's important that the community understands what you think of them.

Ahsan-Nayaz commented 3 weeks ago

Is this Issue still active? Very keen on this feature. @dokterbob any updates?