
LlamaIndex.TS


LlamaIndex is a data framework for your LLM application.

Use your own data with large language models (LLMs such as OpenAI's ChatGPT) in TypeScript and JavaScript.

Documentation: https://ts.llamaindex.ai/

Try examples online:

Open in StackBlitz

What is LlamaIndex.TS?

LlamaIndex.TS aims to be a lightweight, easy-to-use set of libraries that help you integrate large language models into your applications with your own data.

Multiple JS Environment Support

LlamaIndex.TS supports multiple JS environments, including:

- Node.js
- Deno
- Bun
- React Server Components (Next.js, Waku, Redwood.JS, ...)
- Vercel Edge Runtime (experimental)
- Cloudflare Workers (experimental)

For now, browser support is limited due to the lack of support for AsyncLocalStorage-like APIs.

Getting started

npm install llamaindex
pnpm install llamaindex
yarn add llamaindex
jsr install @llamaindex/core

Node.js

import fs from "fs/promises";
import { Document, VectorStoreIndex } from "llamaindex";

async function main() {
  // Load essay from abramov.txt in Node
  const essay = await fs.readFile(
    "node_modules/llamaindex/examples/abramov.txt",
    "utf-8",
  );

  // Create Document object with essay
  const document = new Document({ text: essay });

  // Split text and create embeddings. Store them in a VectorStoreIndex
  const index = await VectorStoreIndex.fromDocuments([document]);

  // Query the index
  const queryEngine = index.asQueryEngine();
  const response = await queryEngine.query({
    query: "What did the author do in college?",
  });

  // Output response
  console.log(response.toString());
}

main();
Then run the script (tsx lets Node.js execute TypeScript directly):

# `pnpm install tsx` before running the script
node --import tsx ./main.ts
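Note that llamaindex uses OpenAI by default for both the LLM and the embedding model, so OPENAI_API_KEY must be set in your environment before running. The query engine can also stream its answer token by token; here is a minimal sketch of the streaming variant, assuming the stream option documented for query engines:

// Streaming variant of the query above
const stream = await queryEngine.query({
  query: "What did the author do in college?",
  stream: true,
});
for await (const chunk of stream) {
  // each chunk carries the next piece of the generated answer
  process.stdout.write(chunk.response);
}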

React Server Components (Next.js, Waku, Redwood.JS...)

First, you will need to add the llamaindex plugin to your Next.js project.

// next.config.js
const withLlamaIndex = require("llamaindex/next");

module.exports = withLlamaIndex({
  // your next.js config
});

You can combine the Vercel AI SDK (the `ai` package) with llamaindex in Next.js using RSC (React Server Components).

// src/apps/page.tsx
"use client";
import { chatWithAgent } from "@/actions";
import type { JSX } from "react";
import { useFormState } from "react-dom";

// You can use the Edge runtime in Next.js by adding this line:
// export const runtime = "edge";

export default function Home() {
  const [ui, action] = useFormState<JSX.Element | null>(async () => {
    return chatWithAgent("hello!", []);
  }, null);
  return (
    <main>
      {ui}
      <form action={action}>
        <button>Chat</button>
      </form>
    </main>
  );
}

// src/actions/index.ts
"use server";
import { createStreamableUI } from "ai/rsc";
import { OpenAIAgent } from "llamaindex";
import type { ChatMessage } from "llamaindex/llm/types";

export async function chatWithAgent(
  question: string,
  prevMessages: ChatMessage[] = [],
) {
  const agent = new OpenAIAgent({
    tools: [
      // ... adding your tools here
    ],
  });
  const responseStream = await agent.chat({
    stream: true,
    message: question,
    chatHistory: prevMessages,
  });
  const uiStream = createStreamableUI(<div>loading...</div>);
  responseStream
    .pipeTo(
      new WritableStream({
        start: () => {
          uiStream.update("response:");
        },
        write: async (message) => {
          uiStream.append(message.response.delta);
        },
      }),
    )
    .catch(console.error);
  return uiStream.value;
}
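
The tools array in chatWithAgent is left empty above. Here is a minimal sketch of what filling it in could look like, using FunctionTool.from with a hypothetical sumNumbers tool (the name, description, and parameter schema are illustrative):

import { FunctionTool, OpenAIAgent } from "llamaindex";

// Hypothetical tool that sums two numbers. `parameters` is a JSON schema
// describing the arguments, following the function-calling convention.
const sumNumbers = FunctionTool.from(
  ({ a, b }: { a: number; b: number }) => `${a + b}`,
  {
    name: "sumNumbers",
    description: "Use this function to sum two numbers",
    parameters: {
      type: "object",
      properties: {
        a: { type: "number", description: "The first number" },
        b: { type: "number", description: "The second number" },
      },
      required: ["a", "b"],
    },
  },
);

const agent = new OpenAIAgent({ tools: [sumNumbers] });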

Playground

Check out our NextJS playground at https://llama-playground.vercel.app/. The source is available at https://github.com/run-llama/ts-playground.

Core concepts for getting started:

- Document: a text file, PDF, or other contiguous piece of data.
- Node: the basic building block of data; documents are split into node chunks for embedding and retrieval.
- Embedding: a vector representation of text, used to find the most relevant nodes for a query.
- VectorStoreIndex: stores nodes and their embeddings so they can be searched, as in the example above.
- QueryEngine: retrieves relevant nodes and asks the LLM to answer a query over them.

Tips when using in non-Node.js environments

When you are importing llamaindex in a non-Node.js environment (such as React Server Components, Cloudflare Workers, etc.), some classes are not exported from the top-level entry file.

The reason is that some classes are only compatible with the Node.js runtime (e.g. PDFReader), because they use Node.js-specific APIs (like fs, child_process, crypto).

If you need any of those classes, you have to import them directly via their file path in the package. Here's an example for importing the PineconeVectorStore class:

import { PineconeVectorStore } from "llamaindex/storage/vectorStore/PineconeVectorStore";

As the PDFReader does not work with the Edge runtime, here's how to use the SimpleDirectoryReader with the LlamaParseReader to load PDFs:

import { SimpleDirectoryReader } from "llamaindex/readers/SimpleDirectoryReader";
import { LlamaParseReader } from "llamaindex/readers/LlamaParseReader";

export const DATA_DIR = "./data";

export async function getDocuments() {
  const reader = new SimpleDirectoryReader();
  // Load PDFs using LlamaParseReader
  return await reader.loadData({
    directoryPath: DATA_DIR,
    fileExtToReader: {
      pdf: new LlamaParseReader({ resultType: "markdown" }),
    },
  });
}

Note: Reader classes have to be added explicitly to the fileExtToReader map in the Edge version of the SimpleDirectoryReader.
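
From there, the returned documents can be indexed exactly as in the Node.js example earlier. A short sketch reusing getDocuments (the "./loader" import path is hypothetical):

import { VectorStoreIndex } from "llamaindex";
import { getDocuments } from "./loader"; // the helper defined above

// Build a vector index over the parsed PDFs and query it as usual
const index = await VectorStoreIndex.fromDocuments(await getDocuments());
const queryEngine = index.asQueryEngine();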

You'll find a complete example with LlamaIndexTS here: https://github.com/run-llama/create_llama_projects/tree/main/nextjs-edge-llamaparse

Supported LLMs:

- OpenAI (GPT-3.5 and GPT-4 families)
- Anthropic Claude
- Groq
- Llama 2 / Llama 3
- MistralAI
- Fireworks

See the documentation for the full, up-to-date list.
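Switching providers is usually a one-line change via the Settings object. A minimal sketch, assuming the Anthropic integration exported from the top-level package (the model name is illustrative):

import { Anthropic, Settings } from "llamaindex";

// Assumption: swap the default OpenAI LLM for Claude; subsequently created
// indexes and query engines will use this model.
Settings.llm = new Anthropic({ model: "claude-3-sonnet" });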

Contributing:

We are in the very early days of LlamaIndex.TS. If you're interested in hacking on it with us, check out our contributing guide.

Bugs? Questions?

Please join our Discord! https://discord.com/invite/eN6D2HQ4aX