bramses / chatgpt-md

A (nearly) seamless integration of ChatGPT into Obsidian.
MIT License

Some weirdness going on when chatGPT is outputting code blocks and/or [[ #15

Closed: tophee closed this 1 year ago

tophee commented 1 year ago

I'm not sure what is going on, but I think it might be related to [[ being output in stream mode and/or within a code block. I guess they need to be escaped somehow. See here: https://cln.sh/zkG7Zmsj. Hope this is useful.

bramses commented 1 year ago

Hi @tophee,

Yes, this is because the plugin is writing directly to Obsidian's editor when stream is on (see https://github.com/bramses/chatgpt-md/blob/master/main.ts#L164-L170), as if you were manually "typing" the characters really fast.

In fact, I had to artificially slow down the streaming of backtick characters because Obsidian's editor processes them slightly more slowly than other characters (idk why). So what you're seeing is just the name of the game, unfortunately.
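
Conceptually, the "typing" approach looks something like this (a minimal sketch; the helper name and the exact delays are illustrative, not the plugin's actual code):

```typescript
import { Editor } from "obsidian";

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

// Append a streamed chunk to the editor one character at a time,
// exactly as if the user were typing it at the cursor.
async function typeIntoEditor(editor: Editor, chunk: string) {
  for (const char of chunk) {
    editor.replaceSelection(char);
    // Backticks get a slightly longer pause, since the editor handles them more slowly
    await sleep(char === "`" ? 80 : 10);
  }
}
```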

If it bothers you, you can set stream to false and it should load in as one large block. Does that make sense?

tophee commented 1 year ago

You mean it’s a feature, not a bug?

In any case, the first thing I tried was to turn stream off but that didn't stop it from streaming. Then I saw that it was still set to true in Default Chat Frontmatter so I changed it there too, but it still keeps streaming…

bramses commented 1 year ago

@tophee you have to set it to false; the default in the system is stream: true.
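
For reference, the frontmatter at the top of a chat note would then look roughly like this (only the stream key matters here; the other keys are just illustrative of what a chat note's frontmatter might contain):

```md
---
model: gpt-3.5-turbo
temperature: 0.7
stream: false
---
```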

EDIT: just checked, there seems to be a bug somewhere in the logic; I'll mark it for the next release

> You mean it’s a feature, not a bug?

haha, yeah, I suppose

bramses commented 1 year ago

fixed with https://github.com/bramses/chatgpt-md/releases/tag/1.1.1

tophee commented 1 year ago

Thanks for fixing this! But may I ask why you consider the behaviour with square brackets a feature?

As I think about it, maybe I misunderstood you and it's not so much a feature but rather the natural behaviour when streaming text into Obsidian? In that case: are you planning to make it possible to turn this off? I'm not sure what kind of effort is required for this in terms of code, but I'd imagine it would be possible to somehow temporarily deactivate things like auto-completion (or things like that) during streaming, no?

BTW: do you know whether what we see in streaming mode is the response as it is being produced, more or less in real time (which would mean that streaming provides faster responses), or whether the stream is fake in the sense that the entire answer is already produced before the streaming starts (which would mean that turning streaming mode off will not slow down responses)?

bramses commented 1 year ago

> In that case: are you planning to make it possible to turn this off? I'm not sure what kind of effort is required for this in terms of code, but I'd imagine it would be possible to somehow temporarily deactivate things like auto-completion (or things like that) during streaming, no?

I meant feature as in it's built into Obsidian. ChatGPT MD has no control over how Obsidian writes to its own editor (https://github.com/obsidianmd/obsidian-api/blob/master/obsidian.d.ts#L902). It merely takes data from the OpenAI response and appends it to the editor. Anything lower level would probably break CodeMirror or cause some other unforeseen issue.

Edit: That being said, if you do find a solution, I'd be happy to accept it, please feel free to PR!

> BTW: do you know whether what we see in streaming mode is the response as it is being produced, more or less in real time (which would mean that streaming provides faster responses), or whether the stream is fake in the sense that the entire answer is already produced before the streaming starts (which would mean that turning streaming mode off will not slow down responses)?

As of now, the stream is fake, yes. I'm looking into an Event Source patch but that may or may not work, idk yet.
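
For context, true streaming against the chat completions endpoint would look roughly like the sketch below. It assumes fetch and ReadableStream are available in Obsidian's renderer, and it glosses over SSE events that get split across chunks; the function and parameter names are illustrative:

```typescript
// Hedged sketch: consume OpenAI's streamed (SSE) chat completions and
// hand each content delta to a callback as it arrives.
async function streamCompletion(
  apiKey: string,
  messages: object[],
  onToken: (token: string) => void
) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}`, "Content-Type": "application/json" },
    body: JSON.stringify({ model: "gpt-3.5-turbo", messages, stream: true }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    // Each chunk carries one or more "data: {...}" SSE lines, ending with "data: [DONE]"
    for (const line of decoder.decode(value, { stream: true }).split("\n")) {
      if (!line.startsWith("data: ") || line.includes("[DONE]")) continue;
      const delta = JSON.parse(line.slice("data: ".length)).choices[0].delta?.content;
      if (delta) onToken(delta); // e.g. editor.replaceSelection(delta)
    }
  }
}
```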