ollama / ollama-js

Ollama JavaScript library
https://ollama.com
MIT License

Ollama JavaScript Library

The Ollama JavaScript library provides the easiest way to integrate your JavaScript project with Ollama.

Getting Started

```shell
npm i ollama
```

Usage

```javascript
import ollama from 'ollama'

const response = await ollama.chat({
  model: 'llama3.1',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
console.log(response.message.content)
```

Browser Usage

To use the library without Node.js, import the browser module.

```javascript
import ollama from 'ollama/browser'
```

Streaming responses

Response streaming can be enabled by setting `stream: true`, which changes the call to return an `AsyncGenerator` that yields each part of the response as an object.

```javascript
import ollama from 'ollama'

const message = { role: 'user', content: 'Why is the sky blue?' }
const response = await ollama.chat({ model: 'llama3.1', messages: [message], stream: true })
for await (const part of response) {
  process.stdout.write(part.message.content)
}
```
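Each streamed part mirrors the non-streaming response shape, with the text under `part.message.content`. As a server-free sketch, a stand-in async generator shows the same iteration pattern:

```javascript
// Stand-in for the AsyncGenerator returned when stream: true is set;
// each yielded part mirrors the { message: { content } } chunk shape.
async function* fakeStream() {
  for (const content of ['The sky ', 'is blue ', 'because...']) {
    yield { message: { content } }
  }
}

// Accumulate the chunks exactly as you would with a real streamed response.
let full = ''
for await (const part of fakeStream()) {
  full += part.message.content
}
console.log(full) // The sky is blue because...
```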

Create

```javascript
import ollama from 'ollama'

const modelfile = `
FROM llama3.1
SYSTEM "You are Mario from Super Mario Bros."
`
await ollama.create({ model: 'example', modelfile: modelfile })
```

API

The Ollama JavaScript library's API is designed around the Ollama REST API.

chat

```javascript
ollama.chat(request)
```

generate

```javascript
ollama.generate(request)
```
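For example, a minimal completion request supplies a `prompt` and reads the text from `response.response` (this sketch assumes `llama3.1` has been pulled locally):

```javascript
import ollama from 'ollama'

// Generate a completion from a bare prompt (no chat history).
const response = await ollama.generate({
  model: 'llama3.1',
  prompt: 'Why is the sky blue?',
})
console.log(response.response)
```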

pull

```javascript
ollama.pull(request)
```
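A sketch of pulling a model while reporting progress; with `stream: true` the call returns an `AsyncGenerator` of progress parts (the `status`, `completed`, and `total` fields follow the REST API's pull response):

```javascript
import ollama from 'ollama'

// Pull a model, streaming progress events as they arrive.
const stream = await ollama.pull({ model: 'llama3.1', stream: true })
for await (const part of stream) {
  // Each part includes a status string; download parts also carry
  // completed/total byte counts.
  console.log(part.status, part.completed ?? '', part.total ?? '')
}
```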

push

```javascript
ollama.push(request)
```

create

```javascript
ollama.create(request)
```

delete

```javascript
ollama.delete(request)
```

copy

```javascript
ollama.copy(request)
```

list

```javascript
ollama.list()
```

show

```javascript
ollama.show(request)
```

embed

```javascript
ollama.embed(request)
```
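As an illustration, the sketch below embeds two related sentences and compares them with a small cosine-similarity helper (the helper is our own, not part of the library; the model name is an assumption):

```javascript
import ollama from 'ollama'

// Embed two related sentences in a single call; `input` accepts a string
// or an array of strings, and the response carries one vector per input.
const { embeddings } = await ollama.embed({
  model: 'llama3.1', // assumed: any locally pulled model with embedding support
  input: ['Why is the sky blue?', 'What makes the sky look blue?'],
})

// Cosine similarity between two equal-length vectors (illustrative helper).
const cosine = (a, b) => {
  let dot = 0, na = 0, nb = 0
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i]
    na += a[i] ** 2
    nb += b[i] ** 2
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb))
}
console.log(cosine(embeddings[0], embeddings[1]))
```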

ps

```javascript
ollama.ps()
```
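A short sketch combining `list` and `ps` to inspect local and loaded models (the field names follow the REST API responses):

```javascript
import ollama from 'ollama'

// Models available locally.
const { models } = await ollama.list()
for (const m of models) {
  console.log(m.name, m.size)
}

// Models currently loaded into memory.
const running = await ollama.ps()
for (const m of running.models) {
  console.log(m.name, m.expires_at)
}
```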

abort

```javascript
ollama.abort()
```

This method will abort all streamed generations currently running with the client instance. If there is a need to manage streams with timeouts, it is recommended to have one Ollama client per stream.

All asynchronous iterators listening to streams (typically `for await (const part of response)` loops) will throw an `AbortError` exception. See `examples/abort/abort-all-requests.ts` for an example.
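A sketch of the abort pattern: start a streamed chat, abort it shortly afterwards, and catch the resulting `AbortError` (the one-second timeout is arbitrary):

```javascript
import ollama from 'ollama'

const stream = await ollama.chat({
  model: 'llama3.1',
  messages: [{ role: 'user', content: 'Write a long essay about the sky.' }],
  stream: true,
})

// Abort the in-flight generation after one second.
setTimeout(() => ollama.abort(), 1000)

try {
  for await (const part of stream) {
    process.stdout.write(part.message.content)
  }
} catch (error) {
  if (error.name === 'AbortError') {
    console.log('\ngeneration aborted')
  } else {
    throw error
  }
}
```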

Custom client

A custom client can be created by passing configuration fields, such as the server `host`, to the `Ollama` constructor:

```javascript
import { Ollama } from 'ollama'

const ollama = new Ollama({ host: 'http://127.0.0.1:11434' })
const response = await ollama.chat({
  model: 'llama3.1',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
```
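Besides `host`, the constructor also accepts a custom `fetch` implementation, which can be used to attach headers or route requests through a proxy. A sketch, where the header name and token are purely illustrative:

```javascript
import { Ollama } from 'ollama'

// Wrap the global fetch to attach an auth header to every request
// (the Authorization header and token here are hypothetical).
const ollama = new Ollama({
  host: 'http://127.0.0.1:11434',
  fetch: (input, init = {}) => {
    const headers = new Headers(init.headers)
    headers.set('Authorization', 'Bearer my-token')
    return fetch(input, { ...init, headers })
  },
})

const response = await ollama.chat({
  model: 'llama3.1',
  messages: [{ role: 'user', content: 'Hello!' }],
})
console.log(response.message.content)
```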

Building

To build the project files, run:

```shell
npm run build
```