quarkiverse / quarkus-langchain4j

Quarkus Langchain4j extension
https://docs.quarkiverse.io/quarkus-langchain4j/dev/index.html
Apache License 2.0

Allow creating and launching an OpenWebUI container via Dev UI #600

Closed iocanel closed 3 months ago

iocanel commented 4 months ago

Resolves: #581

The pull request introduces the following:

iocanel commented 4 months ago

There is an issue with the volume handling. It does not appear if you've created the container manually, but it will hit users who have never used Open WebUI before and just tried to create it from the Dev UI.

Converting to draft till I solve this one.

geoand commented 4 months ago

@iocanel is this ready or still in draft?

iocanel commented 4 months ago

@iocanel is this ready or still in draft?

This is ready

jmartisk commented 4 months ago

IIUC this just connects to the remote model provider directly by itself, so we can't have it working with our RAG mechanisms etc., right? Or could we make it work by, say, having Quarkus expose an Ollama/OpenAI-compatible API as a wrapper that adds our RAG, applies our configuration, and lets Open WebUI call the remote model through that?

iocanel commented 4 months ago

IIUC this just connects to the remote model provider directly by itself, so we can't have it working with our RAG mechanisms etc., right? Or could we make it work by, say, having Quarkus expose an Ollama/OpenAI-compatible API as a wrapper that adds our RAG, applies our configuration, and lets Open WebUI call the remote model through that?

Your understanding is correct; it just runs an Open WebUI instance that is aligned with the app in terms of the model. Open WebUI does provide some options in its interface for configuring RAG, but I don't know if there is room for integration. It's definitely something we can explore in the future.
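As a rough illustration of the wrapper idea (only a sketch, not what this PR does; all names below are hypothetical), it could be a plain REST endpoint that delegates to a quarkus-langchain4j AI service, so the retrieval augmentor and configuration registered in the app are applied before the request reaches the model provider:

```java
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;
import jakarta.inject.Inject;
import jakarta.ws.rs.Consumes;
import jakarta.ws.rs.POST;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

// Hypothetical AI service; in a real application the retrieval augmentor (RAG)
// and model configuration are attached to it the usual quarkus-langchain4j way.
@RegisterAiService
interface WrappedAssistant {
    String chat(@UserMessage String message);
}

// Hypothetical endpoint that a UI could call instead of the raw model provider.
@Path("/wrapped-model")
public class WrappedModelResource {

    @Inject
    WrappedAssistant assistant;

    @POST
    @Path("/chat")
    @Consumes(MediaType.TEXT_PLAIN)
    @Produces(MediaType.TEXT_PLAIN)
    public String chat(String prompt) {
        // The AI service applies RAG retrieval and the app's configuration
        // before the call reaches the underlying model provider.
        return assistant.chat(prompt);
    }
}
```

The missing piece for Open WebUI specifically would be exposing this behind an Ollama- or OpenAI-compatible wire format rather than a plain-text endpoint, which is the open question above.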

geoand commented 4 months ago

Should we merge this or are we waiting for some changes?

iocanel commented 4 months ago

Updated PR:

iocanel commented 4 months ago

For anyone trying it with Ollama running locally, don't forget to set OLLAMA_HOST=0.0.0.0 (for example, start the server with OLLAMA_HOST=0.0.0.0 ollama serve), otherwise Ollama won't be visible from the container.

geoand commented 4 months ago

I just built this locally but I don't see anything new in the Dev UI, so I guess I am missing something :)

geoand commented 3 months ago

@iocanel any idea about ^?

iocanel commented 3 months ago

@iocanel any idea about ^?

The only thing I can think of is that either your sample is not using Ollama or OpenAI, or it's not using a SNAPSHOT. Let me try it again.

iocanel commented 3 months ago

After rebasing, I can no longer see anything myself

geoand commented 3 months ago

So it's not just me :)

iocanel commented 3 months ago

So it's not just me :)

Yeah, the PR needed alignment with the new module structure. Fixed now.

geoand commented 3 months ago

I just tried it and it works nicely! The UI could certainly be improved, but I think we should merge now and improve later.

geoand commented 3 months ago

cc @jmartisk