Closed faspie closed 6 months ago
You can also set the environment variable `OPENAI_API_BASE`:

```shell
export OPENAI_API_BASE=http://someinternalhost.local/v1
```
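A minimal sketch of how an application could resolve the endpoint, preferring the `OPENAI_API_BASE` environment variable over a config value and a built-in default (the `resolve_api_base` helper and `config_value` parameter are hypothetical, for illustration only):

```python
import os

# Default endpoint of the hosted OpenAI API.
DEFAULT_API_BASE = "https://api.openai.com/v1"

def resolve_api_base(config_value=None):
    # Precedence: environment variable, then config file value, then default.
    return os.environ.get("OPENAI_API_BASE") or config_value or DEFAULT_API_BASE
```

For example, after `export OPENAI_API_BASE=http://someinternalhost.local/v1`, `resolve_api_base()` returns the internal host instead of the default.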
Hi, thank you for the comments! I've added support for a custom API endpoint through an environment variable in my latest commit. Please let me know if you run into any issues!
Thanks, Sav
https://github.com/mudler/LocalAI
Support could simply be added by making the URL configurable through the config.json file:
"api_options": { "model": "whisper-1", "language": null, "temperature": 0.0, "initial_prompt": null **"base_api": "http://someinternalhost.local/v1"** },