Closed adam-gaia closed 2 months ago
That's because there is a 10-second timeout configured for the Ollama client that needs to be changed (or made configurable): https://github.com/danielmiessler/fabric/blob/a51a565cdc9ef20dcdeadac593dca8f374f593f1/vendors/ollama/ollama.go#L46
I changed the timeout in this file to 1200000 to make sure it never times out, but it still times out after 10 seconds.
Yeah, just something on the API side.
I would like to give a gentle nudge to reopen this issue.
After a small bit of testing, I am skeptical that ollama is closing the connection from its side.
More or less, this is the test that I ran:

```shell
root@e1f29ec3d773:/go# time curl -v http://ollama:11434/api/generate -d '{
  "model": "llama3.1:latest",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```
Here was the output:

```shell
root@e1f29ec3d773:/go# time curl -v http://ollama:11434/api/generate -d '{
  "model": "llama3.1:latest",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
*   Trying 172.20.0.2:11434...
* Connected to ollama (172.20.0.2) port 11434 (#0)
> POST /api/generate HTTP/1.1
> Host: ollama:11434
> User-Agent: curl/7.88.1
> Accept: */*
> Content-Length: 87
> Content-Type: application/x-www-form-urlencoded
>
< HTTP/1.1 200 OK
< Content-Type: application/json; charset=utf-8
< Date: Mon, 19 Aug 2024 19:49:25 GMT
< Transfer-Encoding: chunked
<
{"model":"llama3.1:latest","created_at":"2024-08-19T19:49:25.358194669Z","response":"The sky appears blue to us during the day because of a phenomenon called blah blah blah...","done":true,"done_reason":"stop","context":[128009,numbers,numbers,numbers,13],"total_duration":37764246599,"load_duration":23378833,"prompt_eval_count":17,"prompt_eval_duration":107983000,"eval_count":379,"eval_duration":37591889000}

real	0m37.829s
user	0m0.011s
sys	0m0.007s
```
The curl request took >30 seconds.
A gentle check-in: was the code recompiled after changing the timeout to 1200000 ms (20 minutes)? I know I have made this type of mistake myself numerous times, especially with rebuilding Docker containers after making a change.
Here is the environment that I tested...
Docker Compose for Ollama and Open WebUI:
```yaml
services:
  ollama:
    volumes:
      - ollama:/root/.ollama
    container_name: ollama
    pull_policy: always
    tty: true
    restart: unless-stopped
    # image: ollama/ollama:${OLLAMA_DOCKER_TAG-latest}
    image: ollama/ollama:latest

  open-webui:
    build:
      context: .
      args:
        OLLAMA_BASE_URL: '/ollama'
      dockerfile: Dockerfile
    # image: ghcr.io/open-webui/open-webui:${WEBUI_DOCKER_TAG-main}
    image: ghcr.io/open-webui/open-webui:latest
    container_name: open-webui
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      # - ${OPEN_WEBUI_PORT-3000}:8080
      - 3000:8080
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434'
      # - 'WEBUI_SECRET_KEY='
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped

volumes:
  ollama: {}
  open-webui: {}
```
Here is a super simple Dockerfile for Fabric (`Dockerfile-fabric`):

```dockerfile
FROM golang:1.23.0
COPY ./brown-fox.txt /go/brown-fox.txt
COPY ./.env /root/.config/fabric/.env
#RUN go install github.com/danielmiessler/fabric@latest
#RUN fabric --setup
```
Here are the `.env` contents:

```
DEFAULT_VENDOR=Ollama
DEFAULT_MODEL=llama3.1
OLLAMA_API_URL=http://ollama:11434
OPENAI_API_KEY=NULL
GEMINI_API_KEY=
GROQ_API_KEY=
ANTHROPIC_API_KEY=
OPENAI_API_KEY=
YOUTUBE_API_KEY=my key
```
FWIW, the .env file is used to help get around a bug associated with entering the youtube key (https://github.com/danielmiessler/fabric/issues/832).
Contents of `brown-fox.txt`:

```
The quick brown fox jumped over the lazy dog.
```
Docker commands:

```shell
$ docker build -t fabric -f Dockerfile-fabric .
$ docker container run --network ollama_default --rm -it fabric bash
```
Then when inside the docker container, run the following commands...
To install:

```shell
root@f57422cd8889:/go# go install github.com/danielmiessler/fabric@latest
```

Output:

```
go: downloading github.com/danielmiessler/fabric v1.4.2
go: downloading github.com/jessevdk/go-flags v1.6.1
go: downloading github.com/atotto/clipboard v0.1.4
go: downloading github.com/go-git/go-git/v5 v5.12.0
go: downloading github.com/otiai10/copy v1.14.0
go: downloading github.com/pkg/errors v0.9.1
go: downloading github.com/joho/godotenv v1.5.1
go: downloading github.com/samber/lo v1.47.0
go: downloading github.com/liushuangls/go-anthropic/v2 v2.6.0
go: downloading github.com/sashabaranov/go-openai v1.28.2
go: downloading github.com/google/generative-ai-go v0.17.0
go: downloading google.golang.org/api v0.192.0
go: downloading github.com/ollama/ollama v0.3.6
go: downloading golang.org/x/sys v0.24.0
go: downloading dario.cat/mergo v1.0.0
go: downloading github.com/ProtonMail/go-crypto v1.0.0
go: downloading github.com/go-git/go-billy/v5 v5.5.0
go: downloading github.com/sergi/go-diff v1.3.2-0.20230802210424-5b0b94c5c0d3
go: downloading github.com/emirpasic/gods v1.18.1
go: downloading golang.org/x/sync v0.8.0
go: downloading golang.org/x/text v0.16.0
go: downloading cloud.google.com/go/ai v0.8.0
go: downloading cloud.google.com/go v0.115.0
go: downloading github.com/googleapis/gax-go/v2 v2.13.0
go: downloading google.golang.org/genproto/googleapis/rpc v0.0.0-20240730163845-b1a4ccb954bf
go: downloading google.golang.org/grpc v1.64.1
go: downloading google.golang.org/protobuf v1.34.2
go: downloading golang.org/x/crypto v0.25.0
go: downloading github.com/cyphar/filepath-securejoin v0.2.4
go: downloading github.com/pjbgf/sha1cd v0.3.0
go: downloading github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99
go: downloading github.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376
go: downloading cloud.google.com/go/longrunning v0.5.7
go: downloading google.golang.org/genproto/googleapis/api v0.0.0-20240711142825-46eb208f015d
go: downloading github.com/cloudflare/circl v1.3.7
go: downloading github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da
go: downloading github.com/kevinburke/ssh_config v1.2.0
go: downloading github.com/skeema/knownhosts v1.2.2
go: downloading github.com/xanzy/ssh-agent v0.3.3
go: downloading golang.org/x/net v0.27.0
go: downloading cloud.google.com/go/auth v0.8.1
go: downloading golang.org/x/oauth2 v0.22.0
go: downloading cloud.google.com/go/auth/oauth2adapt v0.2.3
go: downloading cloud.google.com/go/compute/metadata v0.5.0
go: downloading go.opencensus.io v0.24.0
go: downloading go.opentelemetry.io/contrib/instrumentation/google.golang.org/grpc/otelgrpc v0.51.0
go: downloading golang.org/x/time v0.6.0
go: downloading go.opentelemetry.io/contrib/instrumentation/net/http/otelhttp v0.51.0
go: downloading github.com/google/uuid v1.6.0
go: downloading gopkg.in/warnings.v0 v0.1.2
go: downloading github.com/google/s2a-go v0.1.8
go: downloading github.com/googleapis/enterprise-certificate-proxy v0.3.2
go: downloading go.opentelemetry.io/otel v1.26.0
go: downloading go.opentelemetry.io/otel/metric v1.26.0
go: downloading go.opentelemetry.io/otel/trace v1.26.0
go: downloading github.com/felixge/httpsnoop v1.0.4
go: downloading github.com/go-logr/logr v1.4.2
go: downloading github.com/go-logr/stdr v1.2.2
```
Even though we are supplying a .env file, setup still needs to be run to move the patterns into /root/.config/fabric/patterns:
```shell
root@f57422cd8889:/go# fabric --setup
```

Output:

```
[Azure]
Enter your Azure API KEY (leave empty to skip):
[Azure] skiped
[Ollama]
Enter your Ollama URL (as a reminder, it is usually http://localhost:11434) (leave empty for 'http://ollama:11434' or type 'reset' to remove the value):
[Ollama] configured
[Grocq]
Enter your Grocq API KEY (leave empty to skip):
[Grocq] skiped
[Gemini]
Enter your Gemini API KEY (leave empty to skip):
[Gemini] skiped
[Anthropic]
Enter your Anthropic API KEY (leave empty to skip):
[Anthropic] skiped
[OpenAI]
Enter your OpenAI API KEY (leave empty to skip):
[OpenAI] skiped
Available vendor models:
Ollama
[1] llama3.1:latest
[Default]
Enter the index the name of your default model (leave empty for 'llama3.1' or type 'reset' to remove the value):
1
DEFAULT_VENDOR: Ollama
DEFAULT_MODEL: llama3.1:latest
[YouTube]
Enter your YouTube API KEY (leave empty for 'my key' or type 'reset' to remove the value):
[Patterns Loader]
Enter the default Git repository URL for the patterns (leave empty for 'https://github.com/danielmiessler/fabric.git' or type 'reset' to remove the value):
Enter the default folder in the Git repository where patterns are stored (leave empty for 'patterns' or type 'reset' to remove the value):
Downloading patterns and Populating /root/.config/fabric/patterns..
```
FWIW, it would be amazing to have the ability to just download the patterns into /root/.config/fabric/patterns instead of having to run the `fabric --setup` command. Perhaps there is a way to do this and I happened to miss it?

EDIT: I did miss it. `fabric --updatepatterns` can be used directly instead of the `fabric --setup` command. Woo hoo.
Now that everything is set up, we can run our fabric command:

```shell
root@f57422cd8889:/go# time cat brown-fox.txt | fabric --stream --pattern extract_wisdom
Post "http://ollama:11434/api/chat": context deadline exceeded (Client.Timeout exceeded while awaiting headers)^C

real	0m10.404s
user	0m0.006s
sys	0m0.004s
```
Once the `context deadline exceeded` error was returned, I hit Ctrl+C to end the program and get reasonable output from the `time` command.
I am unsure if this post helps or not, but I have been experiencing this problem for a couple of days and I am quite interested in using Fabric in the near future.
I also have this issue since migrating to the new Go version.
This was fixed: https://github.com/danielmiessler/fabric/commit/9eb70b8d808b2beb6320fa60d072a170352a92e9
+1 @PickleOgre +2 For a great username.
> This was fixed:
> https://github.com/danielmiessler/fabric/commit/9eb70b8d808b2beb6320fa60d072a170352a92e9
> +1 @PickleOgre
> +2 For a great username.
That is indeed a fantastic username
What happened?
Hello, when I run the latest version of fabric (version 2.0, built from the last commit to main, a51a56) with a large input from stdin, the process reports a timeout error and hangs. For example, here is the summarize pattern, taking the fabric README in:
The process hangs indefinitely after printing that message.
If I run with a smaller input from stdin, it works as expected.
I'm running fabric with a local instance of ollama. I can confirm my instance of ollama is working correctly (plus the small inputs to fabric 2.0 worked).
Furthermore, I can run the same large-input test with the old python version of fabric and it works as expected.
A bit of a side note which probably doesn't matter: I ran the successful Python version with commit 053e97, which wasn't a tagged release. I hit issue #326 with the last Python tag, v1.3.0, and to get around that I looked for a commit after that issue was fixed. Some extra info in hopes it helps:
I don't yet know where the line is between a "large" and a "small" input.
I haven't tried any other LLM providers like OpenAI. If the source of my issue is not apparent and it would help, I am happy to test with a different provider.
Fabric config
System info
I'm a new fabric user, but I'm really looking forward to trying it out!
Version check
Relevant log output
No response
Relevant screenshots (optional)
No response