Closed Laurian closed 1 month ago
Thanks @Laurian for raising this, and for the well-detailed suggestion.
It is indeed something that I've been thinking about - as that message rendering component is quite important and responsible for a big part of the interaction with the LLM.
I like your suggestion of a custom component to wrap the message. The message rendering is a bit complex, as that's where we handle rendering markdown as it's being generated. But we can probably have a custom message renderer in React as follows:
<AiChat messageRenderer={MyCustomMessageRenderer} />
With `MyCustomMessageRenderer` defined as follows:
```tsx
import {useChatResponseRenderer} from '@nlux/react';

const MyCustomMessageRenderer = (observer: IObserver<DataType>, extras: OtherMessageRelatedInfo): ReactElement => {
    const [ResponseRenderer, status] = useChatResponseRenderer(observer);
    return (
        <div>
            <div>Some custom stuff!</div>
            <ResponseRenderer />
            {(status === 'loading') && <div>Stuff to show when loading!</div>}
            {(status === 'complete') && <div>Stuff to show when message is rendered!</div>}
        </div>
    );
};
```
The `ChatResponseRenderer` component will be aware of what's being generated and will render it as expected, using the appropriate rendering config. The `status` property allows developers to decide what other content to render based on the status. What do you think?
I'm moving this feature request into the Features Roadmap and I'll be prioritising it.
@salmenus We're working on this feature, and we need the unminified nlux-core.js to be able to interact with the message component effectively.
@TechWithTy are you using React or Vanilla JS ?
It's fairly easy with core Vanilla JS, but requires more work with React.
We are using ReactJS/TypeScript, and we are trying to modify this file, but unfortunately it is minified: nlux-core.txt
@salmenus , since we're using ReactJS/Typescript and attempting to enhance our chat functionality, we're encountering challenges with modifying the minified nlux-core.js file. As our application primarily relies on React, we're seeking guidance on how to approach this task effectively. We aim to add buttons for reactions and options to regenerate within our chat reply containers. Given that our application is React-based, how can we achieve this without directly manipulating the DOM, which is not allowed? Any insights or suggestions on how we could tackle this within the React ecosystem would be greatly appreciated. Thanks! #chatgpt-feature
Ok. Given the number of requests, I'm prioritising this issue.
Moving it to In Progress.
It's going to be similar to the JSX for user personas: You will be able to provide your own react component that can handle the rendering of the chat message.
You'll have it ready in the next 48 hours.
Awesome @salmenus
Still WIP. I had to refactor the whole part of the library related to DOM rendering and update it for seamless support of both JS and React. I'm aiming to publish a new version tomorrow with the new rendering and support for custom messages.
I'll keep you posted via this thread.
Thank you, @salmenus
Still work-in-progress. NPM package not published yet.
Progress update:
I'll keep you posted via this thread once the new NPM package is published. Code change here.
Awesome @salmenus Thank you!
This clearly took more than 48 hours! I'm pushing changes to this PR and aiming to merge later this week.
I had to do an entire refactoring of the view/UI layer of both the React and JS ports of the library. It's time-consuming and took longer than expected, but I think it will be very beneficial in the long run.
Awesome, can't wait! Is there any low-hanging fruit we could help with?
PR merged into main branch with new React implementation. RC following.
Let's goo!
Custom renderers are finally here! Along with a full re-write of the NLUX React layer.
This has just been released as part of 2.1.0-beta.
This is a major release with several changes to config options, so expect some breaking changes if you're using v1.x.
You can give the feature a try in this code sandbox: https://codesandbox.io/p/sandbox/wild-rose-84wyvw?file=%2Fsrc%2FApp.tsx
The custom component is provided via the `responseComponent` option, and it supports both `streaming` and `fetch` modes.
For fetch mode, you will also get the full JSON returned from the server.
This major code change and React re-write will enable several other features (RSC, component streaming, etc.). And, similar to other parts of the NLUX code base, it's a high-quality, high-performance code change covered by 600+ unit tests.
We're currently working on updating the docs and improving theming (the last piece of this major code change). Meanwhile, you can look at the TypeScript type definitions or the code base to figure out option values.
--
In action, from the code sandbox example linked above:
```tsx
const MyCustomResponseRenderer: ResponseRenderer<string> = (
    props: FetchResponseComponentProps<string> | StreamResponseComponentProps<string>
) => {
    console.log("Data fetched from LangServe!");
    console.dir(props);

    const propsForFetch = props as FetchResponseComponentProps<string>;
    const propsForStream = props as StreamResponseComponentProps<string>;
    const dataTransferMode = props.dataTransferMode as any;

    return (
        <>
            {dataTransferMode === "fetch" && <div>{propsForFetch.content}</div>}
            {dataTransferMode === "stream" && (<div ref={propsForStream.containerRef} />)}
            <div>Footer Custom Response Component</div>
        </>
    );
};
```
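As a side note, the `as` casts in the sandbox snippet above could be avoided by narrowing on `dataTransferMode` with a discriminated union. A minimal sketch, using simplified, hypothetical prop shapes (not the actual NLUX type definitions, which carry more fields):

```typescript
// Simplified, hypothetical prop shapes mirroring the union used above.
// The real NLUX types carry more fields; this only models the discriminant.
type FetchProps = { dataTransferMode: 'fetch'; content: string };
type StreamProps = { dataTransferMode: 'stream'; containerRef: { current: unknown } };
type ResponseProps = FetchProps | StreamProps;

// Type guard: after this check, TypeScript narrows `props` automatically,
// so no `as` casts are needed inside each branch.
function isFetchResponse(props: ResponseProps): props is FetchProps {
    return props.dataTransferMode === 'fetch';
}

function describe(props: ResponseProps): string {
    if (isFetchResponse(props)) {
        return `fetch: ${props.content}`; // props is FetchProps here
    }
    return 'stream: rendered via containerRef'; // props is StreamProps here
}
```

The same narrowing works inline in a component body, so each branch of the JSX can access mode-specific props without casting.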
```tsx
<AiChat
    adapter={adapter}
    messageOptions={{
        responseComponent: MyCustomResponseRenderer
    }}
/>
```
Reference doc on custom renderers is now available here: https://docs.nlkit.com/nlux/reference/ui/custom-renderers
Examples on the docs website with both streamed and batched custom renderers here: https://docs.nlkit.com/nlux/examples/custom-response-renderers
Also: NLUX v2 is now released, with this feature included.
Several chat UIs out there have extra widgets along a message like thumbs up / down, share, report, etc.
It would be handy to have a way to decorate the message box with interactive components. I think there are 3 ways to approach this:
1. custom components to render messages
2. custom component to wrap the message (keep current rendering)
3. injecting a widget/markup in the current `nluxc-text-message-content` div element

In Discord @salmenus mentioned something along the lines of passing a custom renderer:
```tsx
<AiChat messageRenderer={MyCustomMessageRenderer} />

type CustomMessageRenderer = (message: string, extras: OtherMessageRelatedInfo) => ReactElement
```
But I guess this won't work in stream mode; it would only work as in fetch mode, unless we make the component also deal with the streaming adapter.
Alternatively, a wrapper would just decorate the message, which is rendered by the original core component (`<AiChat />` renders `<MyCustomMessageWrapper message={message} extras={extras}>{children}</My…>`), where `{children}` is the original core component that handles streaming, markdown, etc. And I guess while streaming, the `message` and `extras` props will get updated; maybe a `complete` prop (fed from the streaming adapter observer?) would also be needed, to know when I can interact with my decorations.

Like 2, let's keep the original rendering, but instead of wrapping it, just add some custom component in React or markup in JS:
<AiChat messageHeader={MyCustomMessageHeader} messageFooter={MyCustomMessageFooter} />
But even injecting plain markup would be enough with React: having `messageHeader={<div className="header"></div>}` would allow me to render what I need inside it with `createPortal`.

It would be nice if the customisation could be done in a way that works at the core component level, such that it could be used in both JS and React.
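The `complete` prop idea above could be fed by something as simple as an observer that accumulates streamed chunks and flips a flag when the stream ends. A minimal sketch in plain TypeScript; the names `MessageState` and `makeStreamObserver` are hypothetical, not NLUX APIs:

```typescript
// Hypothetical message state a wrapper component would receive as props.
interface MessageState {
    message: string;   // markdown accumulated so far
    complete: boolean; // true once the streaming adapter signals completion
}

// A tiny observer the streaming adapter could feed: each chunk updates the
// message, and complete() flips the flag so decorations know the message
// is fully rendered and safe to interact with.
function makeStreamObserver(onChange: (state: MessageState) => void) {
    let message = '';
    return {
        next(chunk: string): void {
            message += chunk;
            onChange({ message, complete: false });
        },
        complete(): void {
            onChange({ message, complete: true });
        },
    };
}
```

A wrapper component would then re-render on each `onChange` call and only enable its interactive decorations once `complete` is true.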