Pythagora-io / gpt-pilot

The first real AI developer

[Enhancement]: Ollama local endpoint no-Api key #810

Closed imchris1 closed 5 months ago

imchris1 commented 6 months ago

Version

Command-line (Python) version

Suggestion

I don't use an API key; I run my model locally in Docker with Ollama. How can this work with gpt-pilot?

cocobeach commented 6 months ago

Yes, we could use extra endpoints for local LLMs with Ollama, and Groq for speed. It uses a lot of tokens and most of the time stops halfway; there is too much trial and error to be able to afford it. I am not going to use it until this happens: once bitten, twice shy. With the last version I ended up paying 20 dollars a pop for it failing halfway or getting stuck in a loop. Right now, for instance, it doesn't have permission to activate the venv, though it does create it. Why would that be?

imchris1 commented 6 months ago

No, I host my own model locally; I do not pay money to do this.
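For reference, a self-hosted setup like the one described above is usually started with the official `ollama/ollama` Docker image, which serves its API on port 11434. A minimal sketch (the model name `llama3` is just an example):

```shell
# Start Ollama in Docker; 11434 is Ollama's default API port.
# The named volume keeps downloaded models across container restarts.
docker run -d --name ollama -p 11434:11434 -v ollama:/root/.ollama ollama/ollama

# Pull a model inside the running container (example model name)
docker exec -it ollama ollama pull llama3
```

Once this is running, any client can reach the server at `http://localhost:11434` on the host.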

techjeylabs commented 5 months ago

hey there, you will find some information here:

https://github.com/Pythagora-io/gpt-pilot/wiki/Using-GPT%E2%80%90Pilot-with-Local-LLMs
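As a sketch of what the wiki above describes: the Python version reads its LLM settings from a `.env` file, and Ollama exposes an OpenAI-compatible endpoint that ignores the API key (the client still requires a non-empty value). The exact variable names below are assumptions; verify them against the wiki and your gpt-pilot version:

```shell
# .env fragment for gpt-pilot (variable names may differ by version)
# Point the OpenAI-style client at the local Ollama server
OPENAI_ENDPOINT=http://localhost:11434/v1/chat/completions
# Ollama does not validate the key, but it must not be empty
OPENAI_API_KEY=ollama
```

With this in place, no paid API key is needed; requests go to the local server instead of OpenAI.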

hqnicolas commented 4 months ago

@imchris1 In my case (Ollama-Pilot-CasaOs), I'm using a Docker container to connect GPT Pilot to the Ollama API without a real key. If you like my automation, leave a star ⭐