Closed. BUJIDAOVS closed this issue 8 months ago.
My suggestion would be to take advantage of the LLMs with MATLAB library that MatGPT is built on. You can fork it and customize it to your needs.
The API URL, or endpoint, is defined on line 74 of +llms/+internal/callOpenAIChatAPI.m:
END_POINT = "https://api.openai.com/v1/chat/completions";
You can change that to whatever URL you want to use, then add your forked version of LLMs with MATLAB to the helpers folder in MatGPT.
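For example, in your fork the change could look like the sketch below. The localhost URL is only a placeholder for wherever your service is deployed, and the exact line number may differ between versions of the library.

% In your copy of +llms/+internal/callOpenAIChatAPI.m, point the endpoint
% at your local OpenAI-compatible deployment (placeholder URL shown).
END_POINT = "http://localhost:8000/v1/chat/completions";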
I have an API URL with the same format as OpenAI's, and it's deployed locally. Due to confidentiality requirements, I don't want to use the OpenAI service. How can I redirect requests that would go to api.openai.com to my own local API URL instead?
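If it helps, here is a quick way to sanity-check that a local service actually speaks the OpenAI chat-completions format before pointing MatGPT at it. The URL, model name, and dummy key below are placeholders for your own setup.

% Minimal check against a local OpenAI-compatible endpoint using MATLAB web functions.
url  = "http://localhost:8000/v1/chat/completions";   % placeholder local endpoint
opts = weboptions("MediaType","application/json", ...
                  "HeaderFields",["Authorization" "Bearer dummy-key"]);
body = struct("model","local-model", ...               placeholder model name
              "messages",{{struct("role","user","content","Hello")}});
response = webwrite(url, body, opts);
% If the response follows the OpenAI format, this prints the reply text.
disp(response.choices(1).message.content)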