yokingma / search_with_ai

🤖 Free Search with AI, 💡 Open-Source Perplexity, 📍 Support Ollama/SearXNG, Support Docker deployment. Let AI large language models and search engines answer your questions; supports local LLMs (Ollama) and the SearXNG metasearch engine, with one-click Docker deployment.
https://isou.chat
MIT License
316 stars · 57 forks

Error in displaying sources - failed to obtain information #3

Closed arsaboo closed 2 months ago

arsaboo commented 2 months ago

I was able to get it working using Ollama. I am getting the results, but the sources are not displayed.

[screenshot: answer is rendered, but the sources are not displayed]

Here's my .env

# Bing search key
BING_SEARCH_KEY=
# Google search key
GOOGLE_SEARCH_KEY=
GOOGLE_SEARCH_ID=
# aliyun key
ALIYUN_KEY=
# Yi Key
YI_KEY=
# google gemini
GOOGLE_KEY=
GOOGLE_PROXY_URL=
# baidu
BAIDU_KEY=
BAIDU_SECRET=
# tencent KEY:ID, SECRET:KEY
TENCENT_KEY=
TENCENT_SECRET=
# openai key
OPENAI_KEY=
# openai proxy
OPENAI_PROXY_URL=https://api.openai.com/v1
# moonshot
MOONSHOT_KEY=
# lepton key
LEPTON_KEY=
# Local llm: Ollama host, default http://localhost:11434
OLLAMA_HOST=http://192.168.2.162:11434
# Free search engine: https://www.searx.space/
SEARXNG_HOSTNAME=http://localhost:8080

yokingma commented 2 months ago

Hi, are you using Docker for deployment?

  1. Please make sure SearXNG's API is working.
  2. Make sure this project can access SearXNG's API.
  3. You may need to modify SEARXNG_HOSTNAME in .env if you deploy this project with Docker (please ensure the container can access the host machine's network).
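For reference, steps 1 and 2 above can be checked with curl. This is a sketch: it assumes SearXNG is published on port 8080, that the `json` format is enabled under `search.formats` in SearXNG's settings.yml (it is not by default), and that the app container is named `search_with_ai` with SEARXNG_HOSTNAME set in its environment.

```shell
# Step 1: from the host, confirm SearXNG's search API responds with JSON.
curl -s "http://localhost:8080/search?q=test&format=json"

# Step 2: run the same check from inside the app container, using the
# exact URL the app will use (container name is an assumption).
docker exec search_with_ai sh -c 'curl -s "$SEARXNG_HOSTNAME/search?q=test&format=json"'
```

If step 1 works but step 2 fails, the problem is container-to-host networking rather than SearXNG itself.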
arsaboo commented 2 months ago

I am using Docker for deployment and followed the process listed in the docs. What should SEARXNG_HOSTNAME be? How do I test it?

arsaboo commented 2 months ago

or... do I have to deploy searxng separately?

yokingma commented 2 months ago

@arsaboo Yes, you need to deploy SearXNG separately, e.g. with searxng-docker.

arsaboo commented 2 months ago

@yokingma So, I am running searxng using the following:

version: "3.7"

services:
  redis:
    container_name: redis
    image: docker.io/library/redis:alpine
    command: redis-server --save 30 1 --loglevel warning
    networks:
      - searxng
    volumes:
      - redis-data:/data
    cap_drop:
      - ALL
    cap_add:
      - SETGID
      - SETUID
      - DAC_OVERRIDE

  searxng:
    container_name: searxng
    image: docker.io/searxng/searxng:latest
    networks:
      - searxng
    ports:
      - "8080:8080"
    volumes:
      - ./searxng:/etc/searxng:rw
    environment:
      - SEARXNG_BASE_URL=https://${SEARXNG_HOSTNAME:-localhost}/
    cap_drop:
      - ALL
    cap_add:
      - CHOWN
      - SETGID
      - SETUID
    logging:
      driver: "json-file"
      options:
        max-size: "1m"
        max-file: "1"

networks:
  searxng:
    ipam:
      driver: default

volumes:
  redis-data:

and I can access searxng using the IP address http://192.168.2.122:8080/

But I am still getting the same error. How do I change the SEARXNG_HOSTNAME?

It may be useful for you to create a complete docker compose file...to avoid such issues.
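To answer the SEARXNG_HOSTNAME question above: inside the app's container, `localhost` refers to the container itself, not the host, so the variable must point at an address the container can actually reach. A minimal sketch, assuming SearXNG is reachable at the host LAN IP mentioned above:

```shell
# .env of search_with_ai (assumption: SearXNG is published on the host at
# 192.168.2.122:8080, as in the comment above). Do not use "localhost" here,
# because inside the container it resolves to the container itself.
SEARXNG_HOSTNAME=http://192.168.2.122:8080
```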

arsaboo commented 2 months ago

Same error even with:

version: "3.7"

services:
  redis:
    container_name: redis
    image: docker.io/library/redis:alpine
    command: redis-server --save 30 1 --loglevel warning
    volumes:
      - redis-data:/data
    cap_drop:
      - ALL
    cap_add:
      - SETGID
      - SETUID
      - DAC_OVERRIDE

  searxng:
    container_name: searxng
    image: docker.io/searxng/searxng:latest
    ports:
      - "8080:8080"
    volumes:
      - ./searxng:/etc/searxng:rw
    environment:
      - SEARXNG_BASE_URL=https://${SEARXNG_HOSTNAME:-localhost}/
    cap_drop:
      - ALL
    cap_add:
      - CHOWN
      - SETGID
      - SETUID
    logging:
      driver: "json-file"
      options:
        max-size: "1m"
        max-file: "1"

volumes:
  redis-data:

arsaboo commented 2 months ago

See here for an example of a combined docker-compose file, or it would be great if you could provide one.

yokingma commented 2 months ago

@arsaboo Thanks for your suggestion, I have added a docker-compose file.

docker compose up -d

arsaboo commented 2 months ago

Sources are working now... but for some reason Ollama isn't working with the same URL (I do see Ollama running on that URL). I'll check and let you know

yokingma commented 2 months ago

@arsaboo Maybe the docker can't access the host's network?

yokingma commented 2 months ago

# windows or mac
OLLAMA_HOST=http://host.docker.internal:11434
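When the app runs in Docker, a quick way to confirm the container can reach Ollama is to call Ollama's model-listing endpoint from inside the container. A sketch, assuming the container is named `search_with_ai`:

```shell
# List Ollama's installed models via its HTTP API from inside the container.
# host.docker.internal resolves to the Docker host on Windows/macOS; on Linux,
# add `extra_hosts: ["host.docker.internal:host-gateway"]` to the service
# in docker-compose to get the same behavior.
docker exec search_with_ai curl -s http://host.docker.internal:11434/api/tags
```

If this returns a JSON list of models, OLLAMA_HOST is reachable; a connection error means the container cannot see the host's network.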
arsaboo commented 2 months ago

Ok... so it worked when I ran the Docker container on the same computer where Ollama is running. I was unable to get Ollama working from another host, but I will close this issue since the original problem is fixed. I will create a few more issues [feature requests] for you to consider.