mshumer / gpt-prompt-engineer

MIT License
9.39k stars 642 forks

Add LiteLLM support so that we can use different LLMs for prompt generation #33

Open Greatz08 opened 8 months ago

Greatz08 commented 8 months ago

Instead of supporting only OpenAI and Claude, please add LiteLLM (an open-source multi-LLM abstraction layer) to this project, so that we can point it at a local proxy server API endpoint. That endpoint could serve a self-hosted open-source LLM, a hosted open-source model (e.g. Groq serving Mistral/Llama), or a proprietary LLM like Google Gemini, and the project could use any of them to generate prompts as needed. I know the performance might not match GPT-4, but open-source models are often capable of providing much better results than GPT-3.5.
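A rough sketch of what this could look like: LiteLLM exposes a single `completion()` call that routes to different providers based on the model string, with an optional `api_base` for a local proxy or self-hosted endpoint. The function names, model strings, and proxy URL below are illustrative assumptions, not the project's actual code.

```python
# Hypothetical sketch: routing the project's prompt-generation calls
# through LiteLLM instead of hard-coded OpenAI / Anthropic clients.
# Requires `pip install litellm`; model names and URLs are examples only.

def build_completion_kwargs(model, messages, api_base=None):
    """Collect the arguments for LiteLLM's completion().

    Example model strings: "gpt-4", "claude-3-opus-20240229",
    "groq/llama3-70b-8192", "gemini/gemini-pro", or an OpenAI-compatible
    model served behind a self-hosted proxy via api_base.
    """
    kwargs = {"model": model, "messages": messages}
    if api_base:  # e.g. a local LiteLLM proxy: "http://localhost:4000"
        kwargs["api_base"] = api_base
    return kwargs


def generate_prompt(task_description, model="gpt-4", api_base=None):
    # Import here so the rest of the module works without litellm installed.
    from litellm import completion

    kwargs = build_completion_kwargs(
        model,
        [{"role": "user",
          "content": f"Write a high-quality prompt for: {task_description}"}],
        api_base,
    )
    return completion(**kwargs).choices[0].message.content
```

With this shape, switching from GPT-4 to a local Llama model would only mean changing the `model` argument and pointing `api_base` at the proxy, leaving the prompt-generation logic untouched.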