Christopher-Hayes / vscode-chatgpt-reborn

Refactor, improve, and debug your code in VSCode with GPT-3 and GPT-4.
https://marketplace.visualstudio.com/items?itemName=chris-hayes.chatgpt-reborn

Broken detection of code section boundaries #50

Open wojtekcz opened 11 months ago

wojtekcz commented 11 months ago

Describe the Bug

  1. There should be one code section rendered in the conversation window, starting after the "Here's an example code snippet..." line and ending before the "In this example..." line (see the sketch after this list).
  (screenshot attached)
  2. There also seems to be something fishy going on with the closing paragraph HTML tag </p> at the end of rendered model responses: sometimes one or more > characters are added. I can't tell whether this is bad model output or a rendering problem. Is there a way to examine the raw, streamed model responses?
  (screenshot attached)
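To make the expected boundary concrete, here is a minimal, hedged sketch (not the extension's actual parser) of how fenced code-block boundaries in a response like the one above can be located, and why rendering a partially streamed response, where the closing ``` has not arrived yet, would make the code block appear to run past the "In this example..." line:

```typescript
// Illustrative only: a naive fence-boundary detector over a model response.
// If the renderer processes the text chunk-by-chunk while streaming, the
// closing ``` may not have arrived yet, so the last block swallows the rest
// of the message -- roughly the symptom in the first screenshot.
const response = [
  "Here's an example code snippet that shows how to do it:",
  "```python",
  "print('hello')",
  "```",
  "In this example, the code prints a greeting.",
].join("\n");

function codeBlockRanges(markdown: string): Array<[number, number]> {
  const lines = markdown.split("\n");
  const ranges: Array<[number, number]> = [];
  let start = -1;
  lines.forEach((line, i) => {
    if (line.trimStart().startsWith("```")) {
      if (start === -1) {
        start = i; // opening fence
      } else {
        ranges.push([start, i]); // closing fence
        start = -1;
      }
    }
  });
  // An unmatched opening fence (e.g. mid-stream) runs to the end of the text.
  if (start !== -1) ranges.push([start, lines.length - 1]);
  return ranges;
}

console.log(codeBlockRanges(response)); // expected: [[1, 3]]
```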

There are other rendering and parsing problems with code sections that can be seen in the attached file.

chat_6_transcript.md

Where are you running VSCode? (Optional)

macOS

Which OpenAI model are you using? (Optional)

Llama 2 based model, running on a Mac, served with LM Studio

Additional context (Optional)

extension v3.19.1

Christopher-Hayes commented 11 months ago

Llama 2, nice. I'll take a look. Correctly formatting the markdown has been bug-prone; I'll try to figure out what's going wrong here.

The markdown export should have the raw output from the model (other than the `**[ChatGPT]**` parts), but there's still a chance non-raw output is getting in there.
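A hedged sketch of how the raw text could be pulled back out of that export, assuming the `**[ChatGPT]**` markers are the only decoration added (an assumption about the export format, not a documented contract):

```typescript
// Minimal sketch: strip the speaker markers from the markdown export to get
// back something close to the raw model output. The marker string and the
// file name come from this thread; everything else is an assumption.
import { readFileSync } from "fs";

const transcript = readFileSync("chat_6_transcript.md", "utf8");

// Remove only the "**[ChatGPT]**" decoration; leave everything else untouched
// so any stray ">" characters or broken fences stay visible for debugging.
const raw = transcript.replace(/\*\*\[ChatGPT\]\*\*/g, "");

console.log(raw);
```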

Thinking of adding a way to see the raw output in the chat UI, e.g. with a toggle, since users sometimes want the actual Markdown source and it can be tricky to copy the markdown without the extension converting it to HTML.
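A rough sketch of what such a toggle could look like in the webview code; `marked` stands in for whichever markdown renderer the extension actually uses, and the type and function names here are hypothetical:

```typescript
// Hedged sketch of the proposed toggle: keep the raw streamed text around and
// choose at render time between markdown-to-HTML conversion and a plain view.
import { marked } from "marked";

interface MessageView {
  rawText: string;  // exactly what the model streamed
  showRaw: boolean; // the toggle state
}

function renderMessage(msg: MessageView): string {
  if (msg.showRaw) {
    // Escape so the raw markdown (including ``` fences and </p>) shows as-is.
    const escaped = msg.rawText
      .replace(/&/g, "&amp;")
      .replace(/</g, "&lt;")
      .replace(/>/g, "&gt;");
    return `<pre>${escaped}</pre>`;
  }
  return marked.parse(msg.rawText) as string;
}
```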