FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

[FEATURE] support for lm studio #1276

Open pechaut78 opened 9 months ago

pechaut78 commented 9 months ago

LM Studio is super easy to set up, and simpler than LocalAI.

It mimics the OpenAI API, and LangChain supports it by passing a local base URL.

Would be wonderful to do the same thing with Flowise.

jeffthorness commented 7 months ago

I'm with you and frankly a little puzzled why this isn't already supported.

SphaeroX commented 7 months ago

@dev: just load the OpenAI LLM like this (Python):

from langchain_openai import ChatOpenAI

# Point the client at LM Studio's local OpenAI-compatible server; LM Studio
# ignores the API key, but the client still requires one to be set.
llm = ChatOpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")
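
A quick smoke test of that object could look like the following (a minimal sketch; it assumes the langchain-openai package is installed, LM Studio's local server is running on port 1234, and a model is loaded):

# Send one prompt through LM Studio and print the reply text
print(llm.invoke("Say hello in one sentence.").content)
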
KennyVaneetvelde commented 7 months ago

[screenshot] Works perfectly fine like this for me. I am using LM Studio and just have the API key set to "none" in the credentials. Note that the address for you will probably just be localhost:1234, but since I am running the Docker container, to reach LM Studio I needed to use the IP address of the vEthernet adapter.
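
Before wiring this into Flowise, it can help to confirm the LM Studio server is reachable from wherever Flowise runs. A minimal sketch in Python, assuming LM Studio's OpenAI-compatible server is listening on port 1234 (the URL below is a placeholder; substitute localhost, a vEthernet address, or host.docker.internal as appropriate):

import requests

# LM Studio's OpenAI-compatible server exposes GET /v1/models; a 200 response
# listing the loaded model(s) means the server is reachable from here.
BASE_URL = "http://localhost:1234/v1"  # placeholder; use your host/IP instead
resp = requests.get(BASE_URL + "/models", timeout=5)
resp.raise_for_status()
print(resp.json())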

pechaut78 commented 7 months ago

Excellent! Thanks a lot.

shiloh92 commented 7 months ago

Is there a website where they share workflows?

ArkMaster123 commented 5 months ago

@KennyVaneetvelde I have tried this but it didn't work. I actually have a Flowise instance on DigitalOcean but LM Studio on my local laptop. Do I have to have both locally for this to work? Would love a DO deployment solution! Thanks in advance.

new4u commented 5 months ago

In Docker, there are a few ways to let an application inside a container reach the host machine's localhost. A common one:

Use the special DNS name: Docker gives containers a special DNS name, host.docker.internal, which resolves to the host's IP address. You can use this name inside the container to reach services running on the host.

So I replaced http://localhost:1234/v1 with http://host.docker.internal:1234/v1.

It works!
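
To make that swap concrete, here is a small sketch (the probe_lm_studio helper is hypothetical, written for this thread; it assumes LM Studio is on port 1234, tries the plain-localhost URL first, and falls back to Docker's special host name):

import requests

# Hypothetical helper: return the first base URL from which LM Studio's
# /v1/models endpoint answers, or None if neither is reachable.
def probe_lm_studio():
    for base in ("http://localhost:1234/v1", "http://host.docker.internal:1234/v1"):
        try:
            requests.get(base + "/models", timeout=3).raise_for_status()
            return base
        except requests.RequestException:
            continue
    return None

print(probe_lm_studio())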

RicardoFernandez-UY commented 5 months ago

@KennyVaneetvelde Hi, sounds great! Where did you get the docker image? I wasn't able to find one in dockerhub, and I wouldn't know how to build it. Thank you!!

rachdeg commented 4 months ago

Re new4u's tip above about using http://host.docker.internal:1234/v1 instead of http://localhost:1234/v1:

I tried this but unfortunately it didn't work. What did work was changing the localhost URL to http://172.17.0.1:1234/v1, and then it worked like a charm! Note that I'm running Flowise in Docker and LM Studio locally. (Presumably this is because host.docker.internal is only defined by default on Docker Desktop, while 172.17.0.1 is the default bridge-network gateway on Linux.)

jan-wijman commented 1 month ago

Hi, I tried KennyVaneetvelde's idea from Feb 2, but the chat window is not streaming the message: the LLM's reply appears all at once, although in LM Studio I can see the response building up. What could be wrong in my Flowise diagram? [screenshot]

edvinPL commented 6 days ago

Can you make this work with LM Studio running locally but Flowise on Render, and somehow connect the two?