harry0703 / MoneyPrinterTurbo

Generate high-definition short videos with one click using AI LLMs.
MIT License
17.15k stars 2.7k forks

ollama: failed to generate script: Connection error #388

Closed: moritzrfs closed this issue 5 months ago

moritzrfs commented 5 months ago

Running locally using ollama with the following settings:

ollama_base_url = "http://localhost:11434/v1/"
ollama_model_name = "llama3:8b"
openai_api_key = "123456"

returns

## generating video script
2024-05-26 20:40:06.423 | INFO     | app.services.llm:generate_script:223 - subject: beach
2024-05-26 20:40:06.423 | INFO     | app.services.llm:_generate_response:18 - llm provider: ollama
2024-05-26 20:40:08.952 | ERROR    | app.services.llm:generate_script:259 - failed to generate script: Connection error.
2024-05-26 20:40:08.952 | WARNING  | app.services.llm:generate_script:262 - failed to generate video script, trying again... 1

and docker logs

: _generate_response - llm provider: ollama
2024-05-26 22:40:08 webui  | 2024-05-26 20:40:08 | ERROR | "./app/services/llm.py:259": generate_script - failed to generate script: Connection error.
2024-05-26 22:40:08 webui  | 2024-05-26 20:40:08 | WARNING | "./app/services/llm.py:262": generate_script - failed to generate video script, trying again... 1
2024-05-26 22:40:08 webui  | 2024-05-26 20:40:08 | INFO | "./app/services/llm.py:18": _generate_response - llm provider: ollama

Is there anything wrong with the settings? Ollama is installed directly on the host machine, not running in a container.

moritzrfs commented 5 months ago

Never mind. With Ollama running locally (outside the Docker network), the containerized app must reach the host's Ollama server via host.docker.internal instead of localhost.
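For anyone hitting the same error: when the WebUI runs in Docker and Ollama runs on the host, the base URL in the config would look roughly like this (a sketch; the exact config file name and key names are taken from the settings shown above):

```toml
# config.toml (sketch) – point the container at the host's Ollama server,
# since "localhost" inside the container refers to the container itself
ollama_base_url = "http://host.docker.internal:11434/v1/"
ollama_model_name = "llama3:8b"
# Ollama ignores the API key, but the OpenAI-compatible client may require a non-empty value
openai_api_key = "123456"
```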

Problem solved.
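Note for Linux users: host.docker.internal is not defined by default on Linux Docker. A Compose override like the following (hypothetical service name) maps it to the host gateway:

```yaml
# docker-compose.override.yml (sketch) – "webui" is a placeholder service name
services:
  webui:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```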