nlkitai / nlux

The π—£π—Όπ˜„π—²π—Ώπ—³π˜‚π—Ή Conversational AI JavaScript Library πŸ’¬ β€”Β UI for any LLM, supporting LangChain / HuggingFace / Vercel AI, and more 🧑 React, Next.js, and plain JavaScript ⭐️
https://docs.nlkit.com/nlux

Error with line break when using streaming #67

Closed ScreamZ closed 3 weeks ago

ScreamZ commented 1 month ago

Hi,

Have a look at my code:

```typescript
// Adapter types from nlux (assuming the @nlux/react entry point).
import type { ChatAdapter, StreamingAdapterObserver } from "@nlux/react";

// sendMessage and parseResponseMessageParsing are app-specific helpers.
export const MyAdapter: ChatAdapter = {
  streamText: async (message: string, observer: StreamingAdapterObserver) => {
    const result = await sendMessage({
      // TODO:
      supaFriend: "clwt7czaf0000fiuonrkj7dlk",
      // TODO
      conversation: "clwt7e3zg00037csd347yqcs3",
      userInputMessage: message,
    });

    console.log(result);

    const parseResult = parseResponseMessageParsing(result.data);

    if (!result.serverError && parseResult.status === "success") {
      console.log("parsed", parseResult.data.message);
      // The full response is pushed as a single chunk, then the stream is closed.
      observer.next(parseResult.data.message);
      observer.complete();
    } else {
      observer.error(new Error(result.serverError || "Could not generate message"));
    }
  },
};
```
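As a side note, the adapter above pushes the whole response in one `observer.next` call. A minimal sketch of the observer contract it relies on, using my own mock types (not nlux's actual exports) that mirror the `next` / `complete` / `error` shape of `StreamingAdapterObserver`:

```typescript
// Mock observer interface, mirroring the shape used by the adapter above.
interface MockObserver {
  next(chunk: string): void;
  complete(): void;
  error(err: Error): void;
}

// A true streaming adapter would usually push many small chunks rather than
// one blob; this helper emits text to the observer in fixed-size slices.
function emitInChunks(text: string, observer: MockObserver, chunkSize = 8): void {
  for (let i = 0; i < text.length; i += chunkSize) {
    observer.next(text.slice(i, i + chunkSize));
  }
  observer.complete();
}

// Usage with a collecting observer:
const received: string[] = [];
let done = false;
emitInChunks("Hello,\nworld! This is a streamed reply.", {
  next: (c) => received.push(c),
  complete: () => { done = true; },
  error: () => {},
});
console.log(received.join(""), done);
```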

This gives me

[screenshot: streamed message rendered with single line breaks ignored]

As you can see, single `\n` characters are being ignored, while double line breaks work. Fair enough, but let's look further.

[screenshot, 2024-05-31: the same message rendered correctly after a page refresh]

I just refreshed the page and printed the last message in the console. Everything gets passed through initialConversation.

Now it works.

Do you have any explanation for this?

Thanks

salmenus commented 1 month ago

This is interesting!

The initialConversation uses a different markdown rendering and parser logic (soft line breaks) while the streaming renderer uses hard line breaks.

This is a bug, due to a difference in the markdown implementation between streaming and snapshots.

We'll fix it in 2 steps:

  • I'll change the config of snapshot streamer to use hard line breaks (same as stream renderer)
  • In future, both streaming and snapshot should use exactly the same code for parsing

I'll comment on this issue once a fix is published.
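Until then, a possible client-side workaround (my suggestion, not an official nlux API) is to normalize lone newlines into Markdown hard breaks before handing text to the streaming observer, so single line breaks survive a soft-break renderer:

```typescript
// Convert a single \n (one that is not part of a \n\n paragraph break) into a
// Markdown hard break: two trailing spaces followed by a newline.
function hardenLineBreaks(text: string): string {
  return text.replace(/(?<!\n)\n(?!\n)/g, "  \n");
}

console.log(JSON.stringify(hardenLineBreaks("line one\nline two")));
// Paragraph breaks (\n\n) are left untouched:
console.log(JSON.stringify(hardenLineBreaks("para one\n\npara two")));
```

The adapter would then call something like `observer.next(hardenLineBreaks(message))`. Note the regex lookbehind requires a reasonably modern JavaScript runtime.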

ScreamZ commented 4 weeks ago

> This is interesting!
>
> The initialConversation uses a different markdown rendering and parser logic (soft line breaks) while the streaming renderer uses hard line breaks.
>
> This is a bug, due to a difference in the markdown implementation between streaming and snapshots.
>
> We'll fix it in 2 steps:
>
> • I'll change the config of snapshot streamer to use hard line breaks (same as stream renderer)
> • In future, both streaming and snapshot should use exactly the same code for parsing
>
> I'll comment on this issue once a fix is published.

Highly appreciated

salmenus commented 3 weeks ago

This should be fixed in 2.3.8 and later.

Markdown streaming has been refactored to use the same parser everywhere.
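For context, the soft vs hard line-break difference behind this bug can be illustrated with a toy renderer (purely illustrative, not nlux's actual parser): under CommonMark soft-break rules a single newline collapses into a space, while hard-break mode turns it into an explicit `<br />`.

```typescript
// Toy renderer: joins lines with a space (soft breaks) or <br /> (hard breaks).
function renderParagraph(text: string, hardBreaks: boolean): string {
  const joiner = hardBreaks ? "<br />" : " ";
  return `<p>${text.split("\n").join(joiner)}</p>`;
}

console.log(renderParagraph("line one\nline two", false)); // soft break: space
console.log(renderParagraph("line one\nline two", true));  // hard break: <br />
```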

salmenus commented 3 weeks ago

Resolving this as confirmed fixed by @ScreamZ on Discord.