vercel / modelfusion

The TypeScript library for building AI applications.
https://modelfusion.dev
MIT License

Ollama Llama3.1 #348

Open prpanto opened 3 months ago

prpanto commented 3 months ago

How do I use llama3.1 with Ollama? Do you support it?

pedinil commented 2 months ago

First, check that you have the model downloaded in Ollama.

You can list all of your local models with the following command:

ollama list

NAME            ID              SIZE    MODIFIED
gemma2:latest   ff02c3702f32    5.4 GB  2 weeks ago
llama3.1:latest 62757c860e01    4.7 GB  2 weeks ago
llama3:latest   365c0bd3c000    4.7 GB  6 weeks ago
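
If llama3.1 does not show up in that list, you can download it first with the standard Ollama CLI:

ollama pull llama3.1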

Once llama3.1 is available, you can use the code below to call the model:

import { generateText, ollama } from "modelfusion";

const prompt = "Why is the sky blue?"; // example prompt

// Generate text with the local llama3.1 model served by Ollama.
const res = await generateText({
  model: ollama
    .ChatTextGenerator({
      model: "llama3.1",
      maxGenerationTokens: 500,
    })
    .withTextPrompt(),
  prompt,
});
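
If you want the response token by token instead of all at once, here is a minimal streaming sketch using ModelFusion's streamText with the same Ollama setup (the prompt string is just an example):

import { ollama, streamText } from "modelfusion";

// Stream the response from the local llama3.1 model as it is generated.
const textStream = await streamText({
  model: ollama
    .ChatTextGenerator({
      model: "llama3.1",
      maxGenerationTokens: 500,
    })
    .withTextPrompt(),
  prompt: "Why is the sky blue?", // example prompt
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}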