noodlapp / noodl

Noodl is a low-code platform for creating full-stack web applications
https://noodl.net
GNU General Public License v3.0

[Feature Request] Allow us to use local LLMs instead of OpenAI #35

Open · Vigilence opened 8 months ago

Vigilence commented 8 months ago

I would love it if we could use our local LLMs instead of OpenAI. I can set up LM Studio as a server and use it in place of OpenAI in other apps like Flowise, AutoGen, etc., and would like to be able to do the same here.

erictuvesson commented 8 months ago

I haven't looked at LM Studio, but does it use the same request format as OpenAI? I'm not sure how Flowise and AutoGen fit into this, since the chosen AI model is tightly integrated here.

Also, the prompts would have to be adjusted to work efficiently with any other LLM.

Vigilence commented 8 months ago

I believe so. Here is an example they share of pointing an existing OpenAI chat-completions setup at their local server.

# Example: reuse your existing OpenAI setup
from openai import OpenAI

# Point to the local server
client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

completion = client.chat.completions.create(
  model="local-model", # this field is currently unused
  messages=[
    {"role": "system", "content": "Always answer in rhymes."},
    {"role": "user", "content": "Introduce yourself."}
  ],
  temperature=0.7,
)

print(completion.choices[0].message)
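
Since the server exposes the standard /v1/chat/completions route, the same call can also be made without the OpenAI SDK at all. A minimal sketch using requests, assuming LM Studio's default port 1234 and a model already loaded in the server:

# Same request sent as raw HTTP, showing the wire format matches
# OpenAI's chat-completions schema (assumes the local server from
# the snippet above is running on the default port 1234)
import requests

resp = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-model",  # ignored by LM Studio, kept for schema compatibility
        "messages": [
            {"role": "system", "content": "Always answer in rhymes."},
            {"role": "user", "content": "Introduce yourself."},
        ],
        "temperature": 0.7,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])

So in principle an integration would only need a configurable base URL; the request and response handling could stay the same as the existing OpenAI path.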