Closed: jwr closed this issue 3 months ago
### Are you familiar with Clojure?
, I'm familiar with Clojure. It's a modern Lisp dialect that runs on the Java Virtual Machine (JVM) and JavaScript engines. Clojure emphasizes functional programming, immutable data structures, and concurrency support through software transactional memory. It has a rich set of data structures and a focus on simplicity and consistency.
###
But this time it seems that 3 characters ("Yes") are missing?
This is fairly consistent:
### Can you write Clojure code?
, I can write Clojure code. Here's a simple example that defines a function to calculate the factorial of a number:
I was wrong: it's not just two characters at the front, but characters missing in multiple places in the response.
I am looking at a gptel chat buffer where there are several places where 1-5 characters are missing (streaming chunk boundaries?), not just at the beginning, but in the middle of the response as well. I see this with Anthropic only, not with OpenAI or ollama.
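If it is indeed a chunk-boundary issue, the failure mode would look something like the sketch below (a hypothetical SSE parser in Python for illustration, not gptel's actual Elisp code). A parser that treats each network chunk as a set of complete `data:` lines silently drops any event whose JSON straddles two chunks; a parser that buffers the trailing partial line recovers it.

```python
import json

def parse_chunks_naive(chunks):
    """Parse each chunk in isolation: deltas split across chunks are lost."""
    text = ""
    for chunk in chunks:
        for line in chunk.split("\n"):
            if line.startswith("data: "):
                try:
                    event = json.loads(line[len("data: "):])
                except json.JSONDecodeError:
                    continue  # partial JSON at a chunk boundary is dropped
                if event.get("type") == "content_block_delta":
                    text += event["delta"]["text"]
    return text

def parse_chunks_buffered(chunks):
    """Carry any incomplete trailing line over to the next chunk."""
    text, buf = "", ""
    for chunk in chunks:
        buf += chunk
        lines = buf.split("\n")
        buf = lines.pop()  # last element may be an incomplete line
        for line in lines:
            if line.startswith("data: "):
                event = json.loads(line[len("data: "):])
                if event.get("type") == "content_block_delta":
                    text += event["delta"]["text"]
    return text

# One event split mid-JSON across two network chunks:
chunks = [
    'data: {"type":"content_block_delta","delta":{"text":"Yes"}}\n'
    'data: {"type":"content_block_del',
    'ta","delta":{"text":","}}\n'
    'data: {"type":"content_block_delta","delta":{"text":" I"}}\n',
]

print(parse_chunks_naive(chunks))     # the "," delta is lost
print(parse_chunks_buffered(chunks))  # all three deltas survive
```

With this input the naive parser produces "Yes I" while the buffered one produces "Yes, I", which matches the symptom of a few characters going missing mid-response.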
(I am back to current head, or 0d6264f)
> I am looking at a gptel chat buffer where there are several places where 1-5 characters are missing (streaming chunk boundaries?), not just at the beginning, but in the middle of the response as well. I see this with Anthropic only, not with OpenAI or ollama.
Thank you for the thorough testing, this is very helpful. It's probably a bug in the parser for the Anthropic-API responses. Could you do one more thing to help me track it down?
Could you set `(setq gptel-log-level 'info)`, try again, and paste the contents of the `*gptel-log*` buffer here? (Check to ensure that the buffer does not contain your API key. At the `info` log level it shouldn't, but please check anyway.)

Unfortunately, I currently can't, as Anthropic locked me out when my $5 credit ran out, and even though I recharged the account and it shows a significant balance in the panel, their API responds with 400 codes. And their support mentions that they respond after 5 days 🤣
If somebody else doesn't help in the meantime, this will have to wait until they tie their shoelaces and get their act together, then I'll be able to get back on it!
I have the same problem, here's my data.
`*ChatGPT*` buffer:
### is this on?
, I'm here and ready to assist you. How can I help?
###
`*gptel-log*` buffer:
{
"gptel": "request body",
"timestamp": "2024-03-07 17:47:00"
}
{
"model": "claude-3-opus-20240229",
"messages": [
{
"role": "user",
"content": "is this on?"
}
],
"system": "You are a large language model living in Emacs and a helpful assistant. Respond concisely.",
"stream": true,
"max_tokens": 1024,
"temperature": 1.0
}
{
"gptel": "response body",
"timestamp": "2024-03-07 17:47:05"
}
event: message_start
data: {"type":"message_start","message":{"id":"msg_01CXFfSUxbDeyrqJkwPj1UnU","type":"message","role":"assistant","content":[],"model":"claude-3-opus-20240229","stop_reason":null,"stop_sequence":null,"usage":{"input_tokens":33,"output_tokens":1}}}
event: content_block_start
data: {"type":"content_block_start","index":0,"content_block":{"type":"text","text":""}}
event: ping
data: {"type": "ping"}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":"Yes"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":","}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":" I"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":"'m"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":" here"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":" and"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":" ready"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":" to"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":" assist"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":" you"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":"."}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":" How"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":" can"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":" I"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":" help"}}
event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":"?"}}
event: content_block_stop
data: {"type":"content_block_stop","index":0}
event: message_delta
data: {"type":"message_delta","delta":{"stop_reason":"end_turn","stop_sequence":null},"usage":{"output_tokens":19}}
event: message_stop
data: {"type":"message_stop"}
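Replaying the `text_delta` payloads from a log like the one above confirms that the full response, including the leading "Yes", did arrive over the wire, so the drop happens in the client-side parsing. A quick hypothetical helper (not part of gptel) to do that replay:

```python
import json

def replay_text(sse_log: str) -> str:
    """Concatenate every text_delta payload found in a raw SSE log."""
    text = ""
    for line in sse_log.splitlines():
        if line.startswith("data: "):
            event = json.loads(line[len("data: "):])
            if event.get("type") == "content_block_delta":
                text += event["delta"]["text"]
    return text

# Abbreviated excerpt of the log above:
log = """\
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":"Yes"}}
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":","}}
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":" here"}}
"""
print(replay_text(log))  # -> Yes, here
```

Running this over the full log reproduces the complete response text, whereas the chat buffer shows it without the first delta.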
The log does show that the very first text delta ("Yes") was received but dropped from the chat buffer.
@solodov Thank you.
It's strange: I've tried all the prompts suggested in this thread so far and I'm not able to reproduce the missing-chunk problem; it works fine in my test case. I'm trying to guess the cause from staring at the parser code now.
Since I'm not sure what's causing the parsing problem, I've attempted a fix based on my best guess. Please let me know if it makes a difference.
Looks like my case now works correctly, thanks!
Okay. I'll wait until @jwr can access Claude again and check if the bug still persists before closing this issue.
Sorry, it took a while for Anthropic to figure out that I do have a positive balance in my account after all.
I can now confirm that the bug is gone, and I get full responses from Anthropic models (tested with fbb0ee2).
Thank you!
I'm trying out gptel with Anthropic Claude and it seems that with many (but curiously not all, I think?) responses the first two characters are lost somewhere.
For example, with an empty prompt in a conversation:
Or when working in another buffer:
Note how in the first example the word "It" is missing, and in the second one "In" is missing.
I'm using gptel 0d6264f in "GNU Emacs 29.1 (build 1, aarch64-apple-darwin21.6.0, Carbon Version 165 AppKit 2113.6) of 2023-08-10" (emacs-mac).