LLocalSearch is a completely locally running search aggregator using LLM agents. The user can ask a question, and the system uses a chain of LLMs to find the answer. The user can watch the agents' progress and see the final answer. No OpenAI or Google API keys are needed.
make dev
docker-compose -f ./docker-compose.dev.yaml build
[+] Building 12.6s (6/6) FINISHED docker:desktop-linux
=> [backend internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> [backend internal] load build definition from Dockerfile.dev 0.0s
=> => transferring dockerfile: 149B 0.0s
=> [backend internal] load metadata for docker.io/library/golang:alpine3 2.5s
=> [backend 1/3] FROM docker.io/library/golang:alpine3.19@sha256:cdc86d9 0.0s
=> CACHED [backend 2/3] WORKDIR /app 0.0s
=> ERROR [backend 3/3] RUN go install github.com/cosmtrek/air@latest 10.1s
[backend 3/3] RUN go install github.com/cosmtrek/air@latest:
10.10 go: github.com/cosmtrek/air@latest: module github.com/cosmtrek/air: Get "https://proxy.golang.org/github.com/cosmtrek/air/@v/list": net/http: TLS handshake timeout
failed to solve: process "/bin/sh -c go install github.com/cosmtrek/air@latest" did not complete successfully: exit code: 1
make: *** [build-dev] Error 17
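The failing step is the `go install` inside the backend image: the Go toolchain cannot complete a TLS handshake with proxy.golang.org, so module resolution times out. One workaround, sketched below, is to tell Go to fetch modules directly from their VCS hosts and skip the checksum database; `GOPROXY` and `GOSUMDB` are standard Go environment variables, but the Dockerfile contents here are only a reconstruction assumed from the three steps visible in the build log, not the project's actual Dockerfile.dev:

```dockerfile
FROM golang:alpine3.19

WORKDIR /app

# Workaround for the TLS handshake timeout: bypass proxy.golang.org and
# the sum database (sum.golang.org), both Google-hosted endpoints that
# may be unreachable from some networks. GOPROXY=direct fetches modules
# straight from their VCS hosts; GOSUMDB=off disables checksum lookups.
ENV GOPROXY=direct \
    GOSUMDB=off

RUN go install github.com/cosmtrek/air@latest
```

If the network genuinely cannot reach github.com either, this will still fail; in that case the fix is at the network/proxy layer rather than in the Dockerfile.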