Closed: alex13by closed this issue 3 months ago
I think some users need freedom. The current approach lets users fill in models from any OpenAI-compatible interface, or even create their own model APIs. Ollama models are also increasing day by day, there are many ways to call Ollama, and its model library is extremely large. I can't lock most models out of the LLM party just for a little convenience.

If you read my how_to_use.md carefully, you will find a node called "Load Model Name" that loads the model names you have configured from the party's config.ini file. This is my balance between freedom and convenience. If you think it is still not convenient enough, please propose a better solution.

The open source community is maintained by everyone and serves the entire community. I will not make other users' experience worse for the sake of your personal convenience.
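For reference, the config.ini entries that "Load Model Name" reads might look something like the sketch below. The section and field names here are assumptions for illustration only; check how_to_use.md for the actual format the party expects:

```ini
; Hypothetical config.ini sketch -- field names are illustrative, not authoritative.
; Each section configures one model endpoint the "Load Model Name" node can pick up.
[my-openai-model]
base_url = https://api.openai.com/v1
api_key = sk-...
model_name = gpt-4o

[my-ollama-model]
; Ollama exposes an OpenAI-compatible endpoint, so it can be configured the same way.
base_url = http://localhost:11434/v1
api_key = ollama
model_name = llama3
```

With entries like these, the node would only need to present the configured names, so the user never has to type a model name by hand or check the directory manually.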
Having to fill in the model name manually, and then also go to the directory to check whether that model actually exists, is very inconvenient.