twinnydotdev / twinny

The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
https://twinny.dev
MIT License

Error parsing JSON: TypeError: Cannot read properties of undefined (reading '0') #179

Closed: DenisBY closed this issue 4 months ago

DenisBY commented 4 months ago

Describe the bug
I installed the VS Code plugin v3.7.19 and run local Ollama in Docker using the official container from Docker Hub. Via curl the API works:

curl -H "Content-Type: application/json" -H "Authorization: Bearer " http://localhost:11434/api/chat -d '{
  "model": "codellama:latest",
  "prompt": "You are a helpful, respectful and honest coding assistant.\nAlways reply with using markdown.\nFor code refactoring, use markdown with code formatting.\n  \n[INST] hey\n [/INST]\n  ",
  "stream": true,
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful, respectful and honest coding assistant.\nAlways reply with using markdown.\nFor code refactoring, use markdown with code formatting.\n  "
    },
    {
      "role": "user",
      "content": "hey",
      "type": "chat"
    }
  ],
  "keep_alive": "5m",
  "options": {
    "temperature": 0.2,
    "num_predict": 512
  }
}'
{"model":"codellama:latest","created_at":"2024-03-18T12:29:46.21718317Z","message":{"role":"assistant","content":"G"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:46.310958751Z","message":{"role":"assistant","content":"reet"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:46.408566108Z","message":{"role":"assistant","content":"ings"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:46.496987368Z","message":{"role":"assistant","content":"!"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:46.590145573Z","message":{"role":"assistant","content":" I"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:46.685530001Z","message":{"role":"assistant","content":"'"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:46.778263718Z","message":{"role":"assistant","content":"m"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:46.870927181Z","message":{"role":"assistant","content":" here"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:46.963743624Z","message":{"role":"assistant","content":" to"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:47.063615447Z","message":{"role":"assistant","content":" help"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:47.15785771Z","message":{"role":"assistant","content":" you"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:47.252329066Z","message":{"role":"assistant","content":" with"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:47.3463564Z","message":{"role":"assistant","content":" any"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:47.439644987Z","message":{"role":"assistant","content":" questions"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:47.533518742Z","message":{"role":"assistant","content":" or"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:47.626457107Z","message":{"role":"assistant","content":" issues"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:47.725030283Z","message":{"role":"assistant","content":" you"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:47.816541095Z","message":{"role":"assistant","content":" might"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:47.909774705Z","message":{"role":"assistant","content":" have"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:48.000492013Z","message":{"role":"assistant","content":"."},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:48.094301973Z","message":{"role":"assistant","content":" What"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:48.190053917Z","message":{"role":"assistant","content":" would"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:48.288297568Z","message":{"role":"assistant","content":" you"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:48.389172832Z","message":{"role":"assistant","content":" like"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:48.482813668Z","message":{"role":"assistant","content":" to"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:48.575548229Z","message":{"role":"assistant","content":" know"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:48.669604156Z","message":{"role":"assistant","content":" or"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:48.763074897Z","message":{"role":"assistant","content":" discuss"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:48.856927548Z","message":{"role":"assistant","content":"?"},"done":false}
{"model":"codellama:latest","created_at":"2024-03-18T12:29:48.950593332Z","message":{"role":"assistant","content":""},"done":true,"total_duration":2829473921,"load_duration":175548,"prompt_eval_duration":95530000,"eval_count":30,"eval_duration":2733365000}

However, via the extension it doesn't:

[Extension Host] 
***Twinny Stream Debug***
Streaming response from 127.0.0.1:11434.
Request body:
{
  "model": "codellama:latest",
  "prompt": "You are a helpful, respectful and honest coding assistant.\nAlways reply with using markdown.\nFor code refactoring, use markdown with code formatting.\n  \n[INST] hey\n [/INST]\n  ",
  "stream": true,
  "messages": [
    {
      "role": "system",
      "content": "You are a helpful, respectful and honest coding assistant.\nAlways reply with using markdown.\nFor code refactoring, use markdown with code formatting.\n  "
    },
    {
      "role": "user",
      "content": "hey",
      "type": "chat"
    }
  ],
  "keep_alive": "5m",
  "options": {
    "temperature": 0.2,
    "num_predict": 512
  }
}

Request options:
{
  "hostname": "127.0.0.1",
  "port": 11434,
  "path": "/api/chat",
  "protocol": "http",
  "method": "POST",
  "headers": {
    "Content-Type": "application/json",
    "Authorization": "Bearer "
  }
}

ERR [Extension Host] Error parsing JSON: TypeError: Cannot read properties of undefined (reading '0')
    at e.getChatDataFromProvider (/home/user/.vscode/extensions/rjmacarthy.twinny-3.7.19/out/index.js:2:130124)
    at onStreamData (/home/user/.vscode/extensions/rjmacarthy.twinny-3.7.19/out/index.js:2:90914)
    at Object.transform (/home/user/.vscode/extensions/rjmacarthy.twinny-3.7.19/out/index.js:2:121478)
    at ensureIsPromise (node:internal/webstreams/util:192:19)
    at transformStreamDefaultControllerPerformTransform (node:internal/webstreams/transformstream:509:18)
    at transformStreamDefaultSinkWriteAlgorithm (node:internal/webstreams/transformstream:559:10)
    at Object.write (node:internal/webstreams/transformstream:364:14)
    at ensureIsPromise (node:internal/webstreams/util:192:19)
    at writableStreamDefaultControllerProcessWrite (node:internal/webstreams/writablestream:1114:5)
    at writableStreamDefaultControllerAdvanceQueueIfNeeded (node:internal/webstreams/writablestream:1229:5)
    at writableStreamDefaultControllerWrite (node:internal/webstreams/writablestream:1103:3)
    at writableStreamDefaultWriterWrite (node:internal/webstreams/writablestream:993:3)
    at [kChunk] (node:internal/webstreams/readablestream:1401:28)
    at readableStreamFulfillReadRequest (node:internal/webstreams/readablestream:1992:24)
    at readableStreamDefaultControllerEnqueue (node:internal/webstreams/readablestream:2183:5)
    at transformStreamDefaultControllerEnqueue (node:internal/webstreams/transformstream:490:5)
    at TransformStreamDefaultController.enqueue (node:internal/webstreams/transformstream:301:5)
    at Object.transform (node:internal/webstreams/encoding:156:22)
    at ensureIsPromise (node:internal/webstreams/util:192:19)
    at transformStreamDefaultControllerPerformTransform (node:internal/webstreams/transformstream:509:18)
    at transformStreamDefaultSinkWriteAlgorithm (node:internal/webstreams/transformstream:559:10)
    at Object.write (node:internal/webstreams/transformstream:364:14)
    at ensureIsPromise (node:internal/webstreams/util:192:19)
    at writableStreamDefaultControllerProcessWrite (node:internal/webstreams/writablestream:1114:5)
    at writableStreamDefaultControllerAdvanceQueueIfNeeded (node:internal/webstreams/writablestream:1229:5)
    at writableStreamDefaultControllerWrite (node:internal/webstreams/writablestream:1103:3)
    at writableStreamDefaultWriterWrite (node:internal/webstreams/writablestream:993:3)
    at [kChunk] (node:internal/webstreams/readablestream:1401:28)
    at readableStreamFulfillReadRequest (node:internal/webstreams/readablestream:1992:24)
    at readableStreamDefaultControllerEnqueue (node:internal/webstreams/readablestream:2183:5)
    at transformStreamDefaultControllerEnqueue (node:internal/webstreams/transformstream:490:5)
    at TransformStreamDefaultController.enqueue (node:internal/webstreams/transformstream:301:5)
    at Object.identityTransformAlgorithm [as transform] (node:internal/deps/undici/undici:11066:22)
    at ensureIsPromise (node:internal/webstreams/util:192:19)
    at transformStreamDefaultControllerPerformTransform (node:internal/webstreams/transformstream:509:18)
    at node:internal/webstreams/transformstream:554:16
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
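
The minified trace does not show what getChatDataFromProvider indexes, but "Cannot read properties of undefined (reading '0')" is the classic signature of doing something[0] on an array that isn't there, for example reading an OpenAI-style choices[0].delta.content out of a response that actually has Ollama's { message: { content } } shape. A hedged sketch of a provider-aware extractor that guards that access instead of crashing (the type names and provider labels here are illustrative, not twinny's actual types):

// Illustrative only: twinny's real getChatDataFromProvider is minified in the trace above.
type OllamaChunk = { message?: { content?: string } };
type OpenAIChunk = { choices?: Array<{ delta?: { content?: string } }> };

function getChatContent(provider: "ollama" | "openai", data: unknown): string {
  if (provider === "ollama") {
    return (data as OllamaChunk).message?.content ?? "";
  }
  // Guard the [0] access: choices may be absent when the payload is
  // actually an Ollama-style chunk or an error object.
  const choices = (data as OpenAIChunk).choices;
  return choices?.[0]?.delta?.content ?? "";
}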

To Reproduce
Steps to reproduce the behavior:

  1. Install the extension
  2. Run Ollama in Docker with, e.g., the 'codellama:latest' model
  3. Type "hey" into the chat box
  4. See the error

Additional context
If I change the path to /v1/chat/completions, Ollama returns 404. Both /api/chat and /api/generate give the same result, via curl and in the extension.

rjmacarthy commented 4 months ago

Hey, have you tried to update Ollama?

DenisBY commented 4 months ago

I'm using Ollama version 0.1.29.
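
When Ollama runs in Docker, the ollama CLI on the host and the server inside the container can be different versions, so it is worth asking the running server directly. A minimal sketch using Ollama's version endpoint (GET /api/version; Node 18+ with built-in fetch, run as an ES module):

// Ask the running server (not the host CLI) which version it is.
const res = await fetch("http://localhost:11434/api/version");
console.log(await res.json()); // e.g. { "version": "0.1.29" }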

DenisBY commented 4 months ago

So, I fixed it. For some reason I actually had version 0.1.22 and had been playing with it, changing paths, etc. After your suggestion I updated to 0.1.29, reverted my changes (/api/chat and /api/generate -> /v1/chat/completions), and now it's working. Thank you!
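
For context on why the update helped: Ollama's OpenAI-compatible /v1/chat/completions endpoint was introduced in v0.1.24, so a 0.1.22 server 404s on that path while still serving /api/chat and /api/generate. A quick sketch to confirm the endpoint after upgrading, assuming the same local server:

// Sanity-check the OpenAI-compatible endpoint (added in Ollama v0.1.24).
const res = await fetch("http://localhost:11434/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "codellama:latest",
    messages: [{ role: "user", content: "hey" }],
  }),
});
console.log(res.status); // 404 on pre-0.1.24 servers, 200 once upgraded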