docker / labs-ai-tools-vscode

Run & debug workflows for AI agents running Dockerized tools in VSCode

Json: Error: Connection error. #35

Closed: jovicon closed this issue 1 week ago

jovicon commented 5 months ago

Hi, I tried to make a runbook with v0.0.10

and I had this problem:

Error: Connection error.

This is my Dockerfile:

FROM node:20.13.1-alpine

# Upgrade busybox and openssl to address vulnerabilities flagged by Docker Scout
RUN apk upgrade busybox openssl

WORKDIR /usr/src/app

COPY ["package.json", "./"]
COPY [".env.example", "./.env"]

RUN npm install

COPY . .

RUN npm run build

EXPOSE 3000

# Exec form so node receives signals directly
CMD ["node", "./dist/src/main.js"]
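
For reference, a typical local build-and-run of the image above would look something like this (the `my-app` tag is just a placeholder, not something from the issue):

# Build the image from the Dockerfile above; "my-app" is a placeholder tag
docker build -t my-app .

# Run the container, publishing the exposed port 3000 to the host
docker run --rm -p 3000:3000 my-app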

and this is my extension config:

[Screenshot: extension configuration, 2024-06-21]

ColinMcNeil commented 4 months ago

Hi, thanks for submitting this issue. At the moment we only support the Ollama endpoint http://localhost:11434/v1, so if you have it running elsewhere, this would cause the error.
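
A quick way to confirm Ollama is actually reachable at that address is to hit its OpenAI-compatible API directly (a minimal sketch, assuming a local Ollama install serving on the default port):

# Lists the locally available models if Ollama is serving its
# OpenAI-compatible API at the default address (assumed here)
curl http://localhost:11434/v1/models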

jovicon commented 3 months ago

Would it be possible to take this as an issue, with a merge request that adds a custom field for setting the local endpoint?

slimslenderslacks commented 1 month ago

@jovicon we've been migrating these runbook prompts so that you can specify the URL in the prompt itself, which lets you point at the OpenAI-compatible endpoint of your choice when you run the prompt. It's a bit of a different flow than what we used previously, but it will be more open to edits. I'll keep this issue open until we have a doc describing how you can try this.

ColinMcNeil commented 1 week ago

Closing this one! Please feel free to check out the main repo with new docs at https://github.com/docker/labs-ai-tools-for-devs. Llama is supported with the following front-matter:

---
url: http://llama-endpoint.local/
model: llama3.2
---

# prompt system
Prompt content
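
For example, to point a prompt at the local Ollama endpoint mentioned earlier in this thread (assuming the url field accepts any OpenAI-compatible base URL), the front-matter might look like the following, with the rest of the prompt unchanged:

---
# Assumed: local Ollama serving its OpenAI-compatible API on the default port
url: http://localhost:11434/v1
model: llama3.2
---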