Closed: utilityfog closed this issue 5 months ago.
Hi @utilityfog, thank you very much for reporting this, and sorry for the late response. I'll look into it and see if I can fix the issue before the end of the week.
@utilityfog The issue has been fixed and deployed in the pre-release version. The bug was in the parsing of the streamed response: the code implicitly assumed how the OpenAI server would chunk the stream, without taking the necessary precautions to handle a change in that behaviour on their end. Unfortunately, it affects all users, regardless of their configuration.
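For anyone curious, this class of bug is common with streamed responses: a robust parser has to buffer partial data and only act on complete, delimiter-terminated events, rather than assuming each network chunk lines up with exactly one event. Here is a minimal generic sketch of that pattern in Python (this is illustrative only, not the extension's actual code; the function name and the `data: ` / blank-line framing are assumptions based on the usual server-sent-events format):

```python
def iter_sse_events(chunks):
    """Reassemble server-sent events from a byte stream, making no
    assumption about where the transport splits the chunks."""
    buffer = b""
    for chunk in chunks:
        buffer += chunk
        # A complete event is terminated by a blank line; anything
        # after the last terminator stays in the buffer for later.
        while b"\n\n" in buffer:
            raw_event, buffer = buffer.split(b"\n\n", 1)
            for line in raw_event.splitlines():
                if line.startswith(b"data: "):
                    yield line[len(b"data: "):].decode("utf-8")
```

The key point is the buffer: even if the server delivers half an event in one chunk and the rest in the next, the parser only emits an event once its terminator has arrived.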
If you are in a hurry, you can switch to the pre-release version this way:
Once I incorporate other updates (specifically, #11), the fix will be merged into the regular release version and this issue will then be closed.
I am thankful for any feature requests regarding the VSCode extension. Please feel free to create a new GitHub issue if you have anything in mind!
The fix for this issue, along with the dependency updates, has now been rolled out to the regular release version of the extension.
Hi Alex! Thank you for the prompt fix of the bug. I have verified that cell completion now works correctly on my device, which is an ARM64 Mac.
When running this extension on an ARM64 device, such as a MacBook Pro (M3 Max), I always receive the error below when passing a cell's contents to the extension's chat completion API handler:
I tried debugging the extension myself but eventually gave up. It seems that this extension is currently incompatible with ARM64 devices. Please have a look!