Hello, and congratulations on this amazing project :)
I'm facing some issues while running with local models (I haven't tested non-local ones yet).
The search always works, but the AI insights rarely do. Most of the time it starts typing but eventually, before finishing, it throws an error like this: `500: Expecting ',' delimiter: line 1 column 7011 (char 7010)`
It's always a different column, though.
I have tested with llama3 and mistral over Ollama, and I'm using SearXNG.
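For what it's worth, that message matches Python's `json.JSONDecodeError`, which suggests the backend is calling `json.loads` on the model's streamed output and the local model is emitting malformed JSON (e.g. a missing comma). A minimal sketch reproducing the same error shape, with a hypothetical payload:

```python
import json

# Hypothetical example of model output with a missing comma between keys,
# the kind of malformed JSON a local LLM can produce mid-stream.
bad_payload = '{"answer": "text" "sources": []}'

try:
    json.loads(bad_payload)
except json.JSONDecodeError as exc:
    # Same shape as the 500 error above:
    # Expecting ',' delimiter: line 1 column N (char N-1)
    print(exc)
```

Since the column varies between runs, this would be consistent with the model breaking the JSON at a different point each time, rather than a fixed bug in the parser.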
docker-compose.yml:

```yaml
services:
  backend:
    build:
      context: .
      dockerfile: ./src/backend/Dockerfile
    restart: unless-stopped
    ports:
      - "3005:3000"
    develop:
      watch:
  searxng:
    container_name: searxngfarfalle
    image: docker.io/searxng/searxng:latest
    restart: unless-stopped
    ports:
```
.env:

```
ENABLE_LOCAL_MODELS=True
OLLAMA_HOST=http://192.168.60.234:11434
SEARCH_PROVIDER=searxng
SEARXNG_BASE_URL=http://searxng:8090
NEXT_PUBLIC_API_URL=http://192.168.60.260:8004
NEXT_PUBLIC_LOCAL_MODE_ENABLED=true
```
Error Log: