nlkitai / nlux

The π—£π—Όπ˜„π—²π—Ώπ—³π˜‚π—Ή Conversational AI JavaScript Library πŸ’¬ β€”Β UI for any LLM, supporting LangChain / HuggingFace / Vercel AI, and more 🧑 React, Next.js, and plain JavaScript ⭐️
https://docs.nlkit.com/nlux
Other
937 stars 48 forks source link

adapter memory #80

Closed sylvain471 closed 3 weeks ago

sylvain471 commented 3 weeks ago

Hello,

I can't figure out how to add memory to my custom adapter. My custom adapter uses Ollama chat api to serve phi3:3.8b locally.

I store message memory using messageReceivedCallback and messageSentCallback, but I just don't see how to pass this to the adapter...

I guess my problem really comes from a lack of understanding of what react/js actually does, or that I should use a ChatAdapter somehow, but any hint to put me on the right path would be much appreciated!

import {AiChat} from '@nlux/react';
import '@nlux/themes/nova.css';

const personaOptions = {assistant: {name: 'DatAI'},user: {name: 'Me'}};

const message = [
  {
    role: 'system',
    message: "You are an expert in data analysis, making sense of unstructed data."
  },
  {
    role: 'user',
    message: "Hi, I need your insights on the data I have collected." 
  },
  {
    role: 'assistant',
    message: "Sure, let see what sense we can make of these data!"
  }
]

const streamAdapter = {
  streamText: async (prompt, observer) => {
    const message = [];
    message.push({"role": "user","content": prompt,})
    const chatURL = 'http://localhost:11434/api/chat';
    const response = await fetch(chatURL, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({
            'model': 'phi3:3.8b',
            'messages':message,
            'stream':true,
            'temperature':0.0,
        })
      })  

    if (response.status !== 200) {observer.error(new Error('Failed to connect to the server'));return;}
    if (!response.body) {return;}

    const reader = response.body.getReader();
    const textDecoder = new TextDecoder();
    let doneReading = false;

    while (!doneReading) {
      const { value, done } = await reader.read();
      if (done) {doneReading = true;continue;}

      const content = textDecoder.decode(value);
      if (content) {
        try {
            observer.next(JSON.parse(content).message.content);
            message.push({
                "role": "assistant",
                "content": JSON.parse(content).message.content,
            })
        }
        catch (error) {observer.next('--');console.log(error);}      
      }
    }
    observer.complete();
  }
};

const messageReceivedCallback = (ev) => {
  message.push({"role":"assistant","message": ev.message.join('')});
  console.log(message);
};

const messageSentCallback = (ev) => {
  message.push({"role":"user","message": ev.message});
};

const eventCallbacks = {
  messageReceived: messageReceivedCallback,
  messageSent: messageSentCallback,
};

function MiniChat()  {
    return (
      <AiChat 
        initialConversation={message}
        adapter={streamAdapter}
        events={eventCallbacks} 
        personaOptions={personaOptions}
      />
    );

};

export default MiniChat;

I also have some troubles with fetch body sometimes catching more than one item, but this I can probably find a work around!

Thanks for this nice project!

sylvain471 commented 3 weeks ago

well, my previous post contained just so many inconsistent pieces of code, no wonder why it did not work.... here is an update that actually does what I expected

import {AiChat} from '@nlux/react';
import '@nlux/themes/nova.css';

const personaOptions = {assistant: {name: 'DatAI'},user: {name: 'Me'}};

const message = [
  {
    role: 'system',
    message: "You are an expert in data analysis, making sense of unstructed data."
  },
  {
    role: 'user',
    message: "Hi, I need your insights on the data I have collected." 
  },
  {
    role: 'assistant',
    message: "Sure, let see what sense we can make of these data!"
  }
]

const streamAdapter = {
  streamText: async (prompt, observer) => {

    message.push({"role": "user","content": prompt})
    const chatURL = 'http://localhost:11434/api/chat';
    const response = await fetch(chatURL, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({
            'model': 'phi3:3.8b',
            'messages':message,
            'stream':true,
            'temperature':0.0,
        })
      })  

    if (response.status !== 200) {observer.error(new Error('Failed to connect to the server'));return;}
    if (!response.body) {return;}

    const reader = response.body.getReader();
    const textDecoder = new TextDecoder();
    let doneReading = false;

    while (!doneReading) {
      const { value, done } = await reader.read();
      if (done) {doneReading = true;continue;}

      const content = textDecoder.decode(value);
      if (content) {
        try {
            observer.next(JSON.parse(content).message.content);
        }
        catch (error) {observer.next('--');console.log(error);}      
      }
    }
    observer.complete();
  }
};

const messageReceivedCallback = (ev) => {
  message.push({"role":"assistant","message": ev.message.join('')});
  console.log(message);
};

const eventCallbacks = {
  messageReceived: messageReceivedCallback,
};

function MiniChat()  {
    return (
      <AiChat 
        initialConversation={message}
        adapter={streamAdapter}
        events={eventCallbacks} 
        personaOptions={personaOptions}
      />
    );

};

export default MiniChat;
salmenus commented 3 weeks ago

This is great. Thanks for sharing the solution @sylvain471

I'm thinking we should add examples in docs examples with Ollama