nlkitai / nlux

The π—£π—Όπ˜„π—²π—Ώπ—³π˜‚π—Ή Conversational AI JavaScript Library πŸ’¬ β€”Β UI for any LLM, supporting LangChain / HuggingFace / Vercel AI, and more 🧑 React, Next.js, and plain JavaScript ⭐️
https://docs.nlkit.com/nlux

[Documentation] - example of using nlux with on device browser llm #70

Closed FriendlyUser closed 2 weeks ago

FriendlyUser commented 1 month ago

Looking at the "in preview" documentation for in-browser LLMs, I thought it might be worth adding a basic example of how to do so.

Once on-device LLMs are generally available, I think it makes sense to add a basic example like:

import {ChatAdapter, StreamingAdapterObserver} from '@nlux/react';

// Streaming adapter backed by the browser's experimental window.ai API
// (Chrome's on-device model), rather than a remote endpoint.

export const streamAdapter: ChatAdapter = {
    streamText: async (
        prompt: string,
        observer: StreamingAdapterObserver,
    ) => {
        const canCreate = await window.ai.canCreateTextSession();
        // canCreate will be one of the following:
        // * "readily": the model is available on-device and so creating will happen quickly
        // * "after-download": the model is not available on-device, but the device is capable,
        //   so creating the session will start the download process (which can take a while).
        // * "no": the model is not available for this device.

        if (canCreate !== "no") {
            const session = await window.ai.createTextSession();

            // Prompt the model and stream the result:
            const stream = session.promptStreaming(prompt);
            for await (const chunk of stream) {
                observer.next(chunk);
            }
        }

        observer.complete();
    },
};
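Since `window.ai` only exists in experimental browser builds, the adapter's streaming contract can be exercised in Node by stubbing that surface. The sketch below uses hypothetical stand-in types (`Availability`, `Observer`, `fakeAi`) that mirror the snippet above rather than nlux's published definitions; it only demonstrates the control flow, not the real API:

```typescript
// Hypothetical stand-ins mirroring the adapter snippet (not nlux's types):
// the availability states window.ai reports, and an observer that receives
// streamed chunks.
type Availability = 'readily' | 'after-download' | 'no';

interface Observer {
    next: (chunk: string) => void;
    complete: () => void;
}

// Fake window.ai surface that streams a canned reply, so the adapter
// logic can run outside the browser.
const fakeAi = {
    canCreateTextSession: async (): Promise<Availability> => 'readily',
    createTextSession: async () => ({
        // Async generator standing in for session.promptStreaming().
        promptStreaming: async function* (_prompt: string) {
            yield 'Hello, ';
            yield 'world!';
        },
    }),
};

// Same control flow as the adapter above, but driven by the fake:
// check availability, stream chunks to the observer, then complete.
export const streamText = async (prompt: string, observer: Observer) => {
    const canCreate = await fakeAi.canCreateTextSession();
    if (canCreate !== 'no') {
        const session = await fakeAi.createTextSession();
        for await (const chunk of session.promptStreaming(prompt)) {
            observer.next(chunk);
        }
    }
    observer.complete();
};
```

Collecting the chunks through an observer (`next` pushes each chunk, `complete` marks the end) reproduces what nlux would render incrementally in the chat UI.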
salmenus commented 2 weeks ago

Great idea @FriendlyUser

I looked at what the Chromium team is building for in-browser LLMs, and it sounds very promising. It's too early to add to NLUX though, as it's still not stable.

But you can create an example, and we can probably include it in the demo repository rather than the main docs website. Feel free to share an example on Discord and I'll consider it for nlkitai/demos or the examples section of the docs.