Robitx / gp.nvim

Gp.nvim (GPT prompt) Neovim AI plugin: ChatGPT sessions & Instructable text/code operations & Speech to text [OpenAI]
MIT License

The response doesn't update promptly in my neovim editor #110

Closed · hallaji closed this 4 days ago

hallaji commented 4 months ago

Hi 👋

For some reason, I have to wait for the complete response to appear. The response doesn't stream into my Neovim editor as it is generated; the chat buffer stays completely blank and then the whole answer shows up at once.

What could be causing this? Is there a configuration option I need to enable to get incremental (streaming) updates?

Robitx commented 4 months ago

Hey, could you try running curl by itself to check how the streaming behaves outside the plugin?

curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "Hello!"
      }
    ],
    "stream": true
  }'
hallaji commented 4 months ago

I do see the chunks streaming in as a result:

data: {"id":"chatcmpl-***","object":"chat.completion.chunk","created":1709256554,"model":"gpt-3.5-turbo-0125","system_fingerprint":"fp_***","choices":[{"index":0,"delta":{"role":"assistant","content":""},"logprobs":null,"finish_reason":null}]}

data: {"id":"chatcmpl-***","object":"chat.completion.chunk","created":1709256554,"model":"gpt-3.5-turbo-0125","system_fingerprint":"fp_***","choices":[{"index":0,"delta":{"content":"Hello"},"logprobs":null,"finish_reason":null}]}

data: {"id":"chatcmpl-***","object":"chat.completion.chunk","created":1709256554,"model":"gpt-3.5-turbo-0125","system_fingerprint":"fp_***","choices":[{"index":0,"delta":{"content":"!"},"logprobs":null,"finish_reason":null}]}

data: {"id":"chatcmpl-***","object":"chat.completion.chunk","created":1709256554,"model":"gpt-3.5-turbo-0125","system_fingerprint":"fp_***","choices":[{"index":0,"delta":{"content":" How"},"logprobs":null,"finish_reason":null}]}

data: {"id":"chatcmpl-***","object":"chat.completion.chunk","created":1709256554,"model":"gpt-3.5-turbo-0125","system_fingerprint":"fp_***","choices":[{"index":0,"delta":{"content":" can"},"logprobs":null,"finish_reason":null}]}

data: {"id":"chatcmpl-***","object":"chat.completion.chunk","created":1709256554,"model":"gpt-3.5-turbo-0125","system_fingerprint":"fp_***","choices":[{"index":0,"delta":{"content":" I"},"logprobs":null,"finish_reason":null}]}

data: {"id":"chatcmpl-***","object":"chat.completion.chunk","created":1709256554,"model":"gpt-3.5-turbo-0125","system_fingerprint":"fp_***","choices":[{"index":0,"delta":{"content":" assist"},"logprobs":null,"finish_reason":null}]}

data: {"id":"chatcmpl-***","object":"chat.completion.chunk","created":1709256554,"model":"gpt-3.5-turbo-0125","system_fingerprint":"fp_***","choices":[{"index":0,"delta":{"content":" you"},"logprobs":null,"finish_reason":null}]}

data: {"id":"chatcmpl-***","object":"chat.completion.chunk","created":1709256554,"model":"gpt-3.5-turbo-0125","system_fingerprint":"fp_***","choices":[{"index":0,"delta":{"content":" today"},"logprobs":null,"finish_reason":null}]}

data: {"id":"chatcmpl-***","object":"chat.completion.chunk","created":1709256554,"model":"gpt-3.5-turbo-0125","system_fingerprint":"fp_***","choices":[{"index":0,"delta":{"content":"?"},"logprobs":null,"finish_reason":null}]}

data: {"id":"chatcmpl-***","object":"chat.completion.chunk","created":1709256554,"model":"gpt-3.5-turbo-0125","system_fingerprint":"fp_***","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}]}

data: [DONE]
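
Since curl streams the chunks fine in a terminal, the next thing worth ruling out is whether the data also arrives incrementally inside Neovim itself. Below is a minimal diagnostic sketch (not part of gp.nvim, and the buffer handling is purely illustrative) that runs the same request through vim.fn.jobstart and appends each raw SSE fragment to a scratch buffer as it arrives. If the scratch buffer fills up line by line, streaming reaches Neovim and the problem is likely on the rendering/plugin side; if it stays blank until the end, something is buffering the curl output.

-- Save as e.g. stream_test.lua (name is arbitrary) and run with :luafile %
-- Requires OPENAI_API_KEY in the environment, same as the curl test above.
local body = vim.json.encode({
  model = "gpt-3.5-turbo",
  messages = {
    { role = "system", content = "You are a helpful assistant." },
    { role = "user", content = "Hello!" },
  },
  stream = true,
})

local buf = vim.api.nvim_create_buf(false, true) -- unlisted scratch buffer
vim.api.nvim_win_set_buf(0, buf)                 -- show it in the current window

vim.fn.jobstart({
  "curl", "-sN", "https://api.openai.com/v1/chat/completions",
  "-H", "Content-Type: application/json",
  "-H", "Authorization: Bearer " .. (os.getenv("OPENAI_API_KEY") or ""),
  "-d", body,
}, {
  stdout_buffered = false, -- deliver output as it arrives, not only at exit
  on_stdout = function(_, data, _)
    vim.schedule(function()
      -- Naive: append every received fragment as its own line. Good enough
      -- to see whether chunks show up incrementally or all at once.
      for _, line in ipairs(data) do
        if line ~= "" then
          vim.api.nvim_buf_set_lines(buf, -1, -1, false, { line })
        end
      end
    end)
  end,
})

The -N (--no-buffer) flag is there because, unlike the terminal test above, a plugin consumes curl through a pipe, so disabling curl's output buffering keeps the comparison fair.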