withcatai / node-llama-cpp

Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level.
https://node-llama-cpp.withcat.ai
MIT License

Support for Llama 3 #207

Closed · clvnthe04 closed this issue 5 months ago

clvnthe04 commented 5 months ago

Feature Description

Llama 3 support

The Solution

Allow for llama 3 support

Considered Alternatives

N/A

Additional Context

No response

Related Features to This Feature Request

Are you willing to resolve this issue by submitting a Pull Request?

No, I don’t have the time and I’m okay to wait for the community / maintainers to resolve this issue.

giladgd commented 5 months ago

Llama 3 is already supported in the version 3 beta.
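
For reference, a minimal sketch of running a Llama 3 model with the version 3 beta API, assuming the beta is installed (e.g. `npm install node-llama-cpp@beta`) and that you already have a Llama 3 GGUF file downloaded; the model filename below is a placeholder:

```typescript
import {fileURLToPath} from "url";
import path from "path";
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

// Load the llama.cpp bindings and a local Llama 3 GGUF model
// (the filename here is a placeholder; point it at your own file).
const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: path.join(__dirname, "models", "Meta-Llama-3-8B-Instruct.Q4_K_M.gguf")
});

// Create a context and start a chat session on one of its sequences.
const context = await model.createContext();
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});

const answer = await session.prompt("Hi there, how are you?");
console.log(answer);
```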