danielmiessler / fabric

fabric is an open-source framework for augmenting humans using AI. It provides a modular framework for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.
https://danielmiessler.com/p/fabric-origin-story
MIT License
19.34k stars 1.99k forks

[Feature request]: OpenAI-compatible Endpoint #249

Closed Koesn closed 3 months ago

Koesn commented 4 months ago

What do you need?

There are a lot of endpoints that support the OpenAI-compatible API, like Groq, Anyscale, or even an LM Studio server (local, or tunneled to a private/home endpoint). Support for this would be very helpful.

cmsax commented 4 months ago

I'll add this feature today if possible

cmsax commented 4 months ago

Oh, I've discovered that setting a custom OpenAI endpoint is already possible by configuring an environment variable named OPENAI_BASE_URL. Consequently, the implementation of this feature is now redundant. 👏
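As a minimal sketch of that configuration (the host, port, and key value below are placeholders, not from this thread):

```shell
# Point fabric (and any client built on the OpenAI SDK) at an
# OpenAI-compatible server by overriding the base URL.
# 'https://ooba-url:5000/v1' is an illustrative value; use your own
# server's scheme, host, port, and the /v1 path.
export OPENAI_BASE_URL='https://ooba-url:5000/v1'
# Many local servers ignore the API key, but the client may still
# require the variable to be set to something.
export OPENAI_API_KEY='placeholder-key'
```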

strikeoncmputrz commented 4 months ago

@cmsax Do you need to include https:// and what endpoint do you add after the port? /v1?

If you could provide an example base url I'd really appreciate it! Didn't see anything on the docs.

strikeoncmputrz commented 4 months ago

> @cmsax Do you need to include https:// and what endpoint do you add after the port? /v1?
>
> If you could provide an example base url I'd really appreciate it! Didn't see anything on the docs.

It was straightforward. `OPENAI_BASE_URL='https://ooba-url:5000/v1'` worked for me. Of note, pipx installs a venv that includes certifi. Certifi has its own CA bundle, so it won't use the operating system's trust store (at least on Ubuntu).

If you run into SSL verification issues, you'll need to add your root CA to certifi's bundle in site-packages, or override the variable that requests uses.
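A sketch of both workarounds (the `root-ca.pem` filename is a placeholder; `REQUESTS_CA_BUNDLE` is the environment variable requests consults for a custom CA bundle):

```shell
# Option 1 (illustrative): append your private root CA to the certifi
# bundle inside the pipx venv, so certifi-backed clients trust it.
cat root-ca.pem >> "$(python3 -c 'import certifi; print(certifi.where())')"

# Option 2: leave certifi alone and point requests at a bundle you
# control, e.g. the system trust store on Ubuntu.
export REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt
```

Option 2 is easier to undo and survives reinstalls of the venv, since it never modifies files inside site-packages.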

Maybe my use case is too niche, but I think it would be beneficial to explicitly add procedures for using any OpenAI-compliant API to the docs. If you folks think this would be worth it, I'd be happy to create a PR.

cmsax commented 4 months ago

Please refer to this installer/client/cli/utils.py

strikeoncmputrz commented 3 months ago

Adding a note in case folks find this issue while searching for tips on configuring fabric to connect to a local inference server.

Confirmed Servers

Both can serve OpenAI-compliant APIs. For tabbyAPI, I needed to change the template to Alpaca.