logancyang / obsidian-copilot

A ChatGPT Copilot in Obsidian
https://www.obsidiancopilot.com/
GNU Affero General Public License v3.0
2.37k stars · 161 forks

Feature Request: Be able to use OpenAI fine tuned model #136

Closed — shuxueshuxue closed this 6 months ago

Sokole1 commented 11 months ago

Hi @shuxueshuxue, it should technically be possible: first set the OpenAI Proxy Base URL to https://api.openai.com/v1, then set the LocalAI model to ft:gpt-3.5-turbo:my-org:custom_suffix:id. You can try this for now, but it isn't ideal. A better solution might be a "Custom Model" command that lets you specify settings such as the model name, base URL, temperature, and token limit, and then select that custom model in the chat dropdown.
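To illustrate why this workaround works: any OpenAI-compatible client pointed at https://api.openai.com/v1 with a fine-tuned model ID should produce a standard chat-completion request. Below is a minimal Python sketch of the payload such a configuration would generate; the `build_chat_request` helper and the default temperature/token values are hypothetical, and the model ID is OpenAI's placeholder format, not a real model.

```python
# Sketch of the request produced by the workaround, assuming the plugin
# sends standard OpenAI chat-completion calls to the configured base URL.

BASE_URL = "https://api.openai.com/v1"  # the "OpenAI Proxy Base URL" setting
MODEL = "ft:gpt-3.5-turbo:my-org:custom_suffix:id"  # the "LocalAI model" setting


def build_chat_request(prompt: str,
                       temperature: float = 0.7,
                       max_tokens: int = 256) -> dict:
    """Assemble the chat-completion payload sent to BASE_URL/chat/completions."""
    return {
        "url": f"{BASE_URL}/chat/completions",
        "json": {
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
            "temperature": temperature,
            "max_tokens": max_tokens,
        },
    }


req = build_chat_request("Summarize my note.")
print(req["url"])           # https://api.openai.com/v1/chat/completions
print(req["json"]["model"]) # ft:gpt-3.5-turbo:my-org:custom_suffix:id
```

A proposed "Custom Model" command would essentially expose these same fields (model, base URL, temperature, token limit) in the settings UI instead of overloading the LocalAI fields.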

[Screenshot: Copilot settings with the OpenAI Proxy Base URL and LocalAI model fields filled in]

Since I don't have any fine-tuned models, below is an example where I set LocalAI model to gpt-3.5-turbo instead (notice how I selected LocalAI):

[Screenshot: chat with LocalAI selected and the model set to gpt-3.5-turbo]
AMGMNPLK commented 10 months ago

Is support for local fine-tuned models on your roadmap?