coleam00 / bolt.new-any-llm

Prompt, run, edit, and deploy full-stack web applications using any LLM you want!
https://bolt.new
MIT License

Error processing request #297

Open · ARIS-2004 opened this issue 3 days ago

ARIS-2004 commented 3 days ago

**Is your feature request related to a problem? Please describe:**

The problem: "There was an error processing your request: No details were returned" is shown, and in the console it shows "hook.js:608 Warning: Encountered two children with the same key, cohere/command. Keys should be unique so that components maintain their identity across updates. Non-unique keys may cause children to be duplicated and/or omitted — the behavior is unsupported and could change in a future version. Error Component Stack".

I added a .env file and then ran it, but I still get the error.

**Describe the solution you'd like:**

**Describe alternatives you've considered:**

**Additional context:**
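
For what it's worth, the warning quoted above indicates that the model list contains two entries with the same identifier, cohere/command, and that this identifier is used as the React key, so the two entries collide. Below is a minimal sketch of the kind of deduplication that would avoid the collision, assuming the dropdown is keyed by the model name (the `ModelInfo` shape and `dedupeByName` helper are hypothetical, not the project's actual code):

```ts
// Sketch only: ModelInfo and its fields are assumptions, not the project's types.
interface ModelInfo {
  name: string; // e.g. "cohere/command", assumed to be used as the React key
  label: string;
  provider: string;
}

// Keep the first occurrence of each name so every React key is unique.
function dedupeByName(models: ModelInfo[]): ModelInfo[] {
  const seen = new Set<string>();
  return models.filter((model) => {
    if (seen.has(model.name)) {
      return false; // a second "cohere/command" entry would be dropped here
    }
    seen.add(model.name);
    return true;
  });
}
```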

amehat commented 19 hours ago

Same issue.

swan01 commented 8 hours ago

I'm also experiencing the same issue. I performed a clean install on this machine, and even installed a brand new instance of the app on another computer! Any help would be greatly appreciated. Thanks.

amehat commented 8 hours ago

I'm adding screenshots to help with resolution:

[Screenshots: bolt-error-1, bolt-error-2, bolt-error-3]

And my docker-compose:

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    volumes:
      - ollama:/root/.ollama
    ports:
      - 11434:11434
    restart: unless-stopped
    networks:
      - ai

  open-webui:
    #build:
    #  context: .
    #  args:
    #    OLLAMA_API_BASE_URL: '/ollama/api'
    #  dockerfile: Dockerfile
    image: dyrnq/open-webui:latest
    container_name: open-webui
    volumes:
      - open-webui:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 8080:8080
    environment:
      - 'OLLAMA_API_BASE_URL=http://ollama:11434/api'
      - 'WEBUI_SECRET_KEY='
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped
    networks:
      - ai

  bolt-ai:
    image: bolt-ai:production
    build:
      context: .
      dockerfile: Dockerfile
      target: bolt-ai-production
    ports:
      - "5173:5173"
    env_file: ".env.local"
    environment:
      - NODE_ENV=production
      - COMPOSE_PROFILES=production
# Not strictly needed, but serves as hints for Coolify
      - PORT=5173
      - GROQ_API_KEY=${GROQ_API_KEY}
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - OPEN_ROUTER_API_KEY=${OPEN_ROUTER_API_KEY}
      - GOOGLE_GENERATIVE_AI_API_KEY=${GOOGLE_GENERATIVE_AI_API_KEY}
      - OLLAMA_API_BASE_URL=${OLLAMA_API_BASE_URL}
      - VITE_LOG_LEVEL=${VITE_LOG_LEVEL:-debug}
      - RUNNING_IN_DOCKER=true
    extra_hosts:
      - "host.docker.internal:host-gateway"
    command: pnpm run dockerstart
    profiles:
      - production  # This service only runs in the production profile
    networks:
      - ai

  bolt-ai-dev:
    image: bolt-ai:development
    build:
      target: bolt-ai-development
    environment:
      - NODE_ENV=development
      - VITE_HMR_PROTOCOL=ws
      - VITE_HMR_HOST=localhost
      - VITE_HMR_PORT=5173
      - CHOKIDAR_USEPOLLING=true
      - WATCHPACK_POLLING=true
      - PORT=5173
      - GROQ_API_KEY=${GROQ_API_KEY}
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - OPEN_ROUTER_API_KEY=${OPEN_ROUTER_API_KEY}
      - GOOGLE_GENERATIVE_AI_API_KEY=${GOOGLE_GENERATIVE_AI_API_KEY}
      - OLLAMA_API_BASE_URL=${OLLAMA_API_BASE_URL}
      - VITE_LOG_LEVEL=${VITE_LOG_LEVEL:-debug}
      - RUNNING_IN_DOCKER=true
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - type: bind
        source: .
        target: /app
        consistency: cached
      - /app/node_modules
    depends_on:
      - ollama
      - open-webui
    ports:
      - "5173:5173"  # Same port, no conflict as only one runs at a time
    command: pnpm run dev --host 0.0.0.0
    profiles: ["development", "default"]  # Make development the default profile
    networks:
      - ai

volumes:
  ollama: {}
  open-webui: {}
  bolt-ai-dev: {}

networks:
  ai:
    driver: bridge
```
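
One thing worth double-checking in this setup: since bolt-ai and ollama share the `ai` network, `OLLAMA_API_BASE_URL` in `.env.local` would normally be `http://ollama:11434` for container-to-container traffic (or `http://host.docker.internal:11434` if Ollama runs on the host); `http://localhost:11434` does not resolve from inside the bolt-ai container.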

And my Modelfile:


```
FROM qwen2.5-coder:7b
PARAMETER num_ctx 32768
```
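
For reference, a Modelfile like this is registered with `ollama create <model-name> -f Modelfile`; raising `num_ctx` to 32768 gives the model a larger context window than Ollama's default.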

The environment variables GROQ_API_KEY, OPENAI_API_KEY, ANTHROPIC_API_KEY, OLLAMA_API_BASE_URL, and OPENAI_LIKE_API_KEY are set in the .env file.
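
Since "No details were returned" might also mean a key never reached the server process, a quick startup check along these lines can confirm the variables are actually visible inside the container (a sketch under that assumption, not part of the project; the names are the ones used in the compose file above):

```ts
// Sketch: warn at startup about any expected variable that is unset or empty.
const requiredVars = [
  'GROQ_API_KEY',
  'OPENAI_API_KEY',
  'ANTHROPIC_API_KEY',
  'OLLAMA_API_BASE_URL',
  'OPENAI_LIKE_API_KEY',
];

for (const name of requiredVars) {
  if (!process.env[name]) {
    console.warn(`Missing or empty environment variable: ${name}`);
  }
}
```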