
Easier usage of LLMs on Rockchip's NPU on SBCs like the Orange Pi 5 and Radxa Rock 5 series

ezrknn-llm

This repo tries to make RKNN LLM usage easier for people who don't want to read through Rockchip's docs.

The main repo is https://github.com/Pelochus/ezrknpu, where you can find more instructions and documentation for general use. This repo covers the details of RKLLM and how to convert models.

Requirements

Keep in mind this repo is focused on:

Quick Install

First clone the repo:

git clone https://github.com/Pelochus/ezrknn-llm

Then run:

cd ezrknn-llm && bash install.sh

Test

Run (assuming you are in the folder where your .rkllm file is located):

rkllm qwen-chat-1_8B.rkllm # Or any other model you like

Converting LLMs for Rockchip's NPUs

Docker

In order to do this, you need an x86 Linux PC (Intel or AMD). Currently, Rockchip does not provide ARM support for converting models, so it can't be done on an Orange Pi or similar. Run:

docker run -it pelochus/ezrkllm-toolkit:latest bash

Then, inside the Docker container:

cd ezrknn-llm/rkllm-toolkit/examples/huggingface/

Now edit test.py to use your preferred model. This container provides Qwen-1.8B, since it is the best-working one and very lightweight. Before converting the model, remember to run git lfs pull in the model's directory to download its weights. To convert the model, run:

python3 test.py
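For orientation, a conversion script like test.py generally follows the shape below. This is a minimal sketch based on the RKLLM-Toolkit examples: the rkllm.api module only exists inside the toolkit container, and exact parameter names (such as quantized_dtype and target_platform) have changed between toolkit releases, so treat it as an outline rather than the exact script.

```python
# Sketch of an RKLLM-Toolkit conversion flow (assumption: rkllm.api as
# shipped in the toolkit container; names may differ per release).
from rkllm.api import RKLLM

llm = RKLLM()

# Load the Hugging Face model from a local directory.
# Run `git lfs pull` there first so the weight files are present.
ret = llm.load_huggingface(model='./Qwen-1_8B-Chat')
if ret != 0:
    raise RuntimeError('Failed to load the Hugging Face model')

# Quantize and build the model for the target NPU.
ret = llm.build(do_quantization=True,
                quantized_dtype='w8a8',
                target_platform='rk3588')
if ret != 0:
    raise RuntimeError('Build failed')

# Export the converted model to an .rkllm file.
ret = llm.export_rkllm('./qwen-chat-1_8B.rkllm')
if ret != 0:
    raise RuntimeError('Export failed')
```

The exported .rkllm file is what you later run on the board with the rkllm command from the Test section.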

Fixing hallucinating LLMs

Check this Reddit post if your LLM seems to be responding with garbage:

https://www.reddit.com/r/RockchipNPU/comments/1cpngku/rknnllm_v101_lets_talk_about_converting_and/

Older versions

There are dedicated branches containing the last commit made by this fork before updating to a newer release from Rockchip. They are also available in this repo's releases. For the latest version, always use the main branch.

Original README starts below




Description

The RKLLM software stack helps users quickly deploy AI models to Rockchip chips. The overall framework is as follows:

In order to use the RKNPU, users first need to run the RKLLM-Toolkit on a computer to convert the trained model into an RKLLM-format model, and then run inference on the development board using the RKLLM C API.

Supported Platforms

Supported Models

Download

RKNN Toolkit2

If you want to deploy additional AI models, we have introduced an SDK called RKNN-Toolkit2. For details, please refer to:

https://github.com/airockchip/rknn-toolkit2

CHANGELOG

v1.0.1

For older versions, please refer to the CHANGELOG.