airockchip / rknn-llm · issues
417 stars · 36 forks
#71 Support RK3566? (tolidano, opened 4 months ago, 0 comments)
#70 Support qwen2-7b? (iamananba, opened 4 months ago, 2 comments)
#69 How to adapt the RWKV LLM? (xinyinan9527, opened 5 months ago, 0 comments)
#68 Error: iommu_context->weight_memory is NULL (meanshe, closed 1 month ago, 1 comment)
#67 Cannot find the clImportMemoryARM (yylt, closed 5 months ago, 1 comment)
#66 Embeddings generation not supported? (woutermans, opened 5 months ago, 1 comment)
#65 ERROR: modpost: "iommu_get_dma_cookie" [drivers/rknpu/rknpu.ko] undefined! (meanshe, opened 5 months ago, 3 comments)
#64 internlm2_rkLLM: conversation quality is poor (shaqing, opened 5 months ago, 2 comments)
#63 qwen-1.8b model: conversation results are poor (shaqing, opened 5 months ago, 4 comments)
#62 Problems running a large model (shaqing, closed 5 months ago, 3 comments)
#61 internlm2-chat-1_8b model conversion error (TRQ-UP, closed 1 month ago, 1 comment)
#60 Hi, where is the code to download this? (federicoparra, closed 5 months ago, 1 comment)
#59 Is rk3582 supported? (walletiger, closed 4 weeks ago, 3 comments)
#58 Instructions? (mtwlz, opened 5 months ago, 0 comments)
#57 How to uninstall the latest rkllm (1.0.1) and install the old one (1.0.0) (thanhtantran, opened 5 months ago, 1 comment)
#56 Does this project plan to support multimodal LLMs, e.g. MiniCPM-V_2.5? (luyu0816, opened 5 months ago, 5 comments)
#55 Specific token appeared in response (fydeos-alex, opened 6 months ago, 0 comments)
#54 OpenAI-compatible API format (smy116, closed 4 months ago, 1 comment)
#53 The server demo fails to run on the board; please help find the cause (luyu0816, opened 6 months ago, 9 comments)
#52 Error when converting a model (17656178609, closed 6 months ago, 0 comments)
#51 Why is result->num in RKLLMResult never assigned? I want to measure tokens generated per second; how should I do it? (17656178609, closed 1 month ago, 2 comments)
#50 Output is all "!!!!!!!" (wmx-github, opened 6 months ago, 0 comments)
#49 Can multimodal models be supported? (17656178609, closed 5 months ago, 3 comments)
#48 Procedure and Model Validation (kalous12, opened 6 months ago, 0 comments)
#47 RKNN: [22:16:52.189] failed to convert handle(1020) to fd, ret: -1, errno: 24, errstr: Too many open files (17656178609, closed 5 months ago, 2 comments)
#46 LLM models have bad output (kalous12, opened 6 months ago, 9 comments)
#45 Will MoE models be supported in the future? (YanxingLiu, opened 6 months ago, 0 comments)
#44 rkllm_init error in rkllm 1.0.1 (eework, closed 6 months ago, 2 comments)
#43 E RKNN: [00:13:05.159] failed to convert handle(1020) to fd, ret: -1, errno: 24, errstr: Too many open files (wmx-github, closed 6 months ago, 1 comment)
#42 Let's talk about converting and runtime (80Builder80, opened 6 months ago, 0 comments)
#41 rkllm 1.0.1 always activates 3 cores (AACengineer, closed 1 month ago, 1 comment)
#40 Is custom model quantization supported in rkllm? (MosRat, closed 5 months ago, 3 comments)
#39 phi3 load error in rkllm 1.0.1 (eework, closed 6 months ago, 9 comments)
#38 flask_server.py demo fails to run (westwind027, closed 6 months ago, 7 comments)
#37 Qwen1.5-0.5B model gives inconsistent answers across two runs (AACengineer, opened 6 months ago, 10 comments)
#36 Phi3 inference fails to finish or produces invalid output at the end (AndySze, closed 6 months ago, 2 comments)
#35 Need a way to stop inference midway (Tidus1991, closed 6 months ago, 1 comment)
#34 Model visualization and inference performance evaluation (AACengineer, opened 6 months ago, 0 comments)
#33 Open Source SDK (80Builder80, closed 6 months ago, 0 comments)
#32 Phi-3 error (dnhkng, opened 6 months ago, 10 comments)
#31 rknn_init failed (YanxingLiu, opened 6 months ago, 1 comment)
#30 Model conversion requires too much memory (YanxingLiu, opened 6 months ago, 6 comments)
#29 Is the performance bottleneck of rknn-llm in the CPU? (Caical, opened 7 months ago, 4 comments)
#28 Phi 3 and Llama 3 compatibility? (Pelochus, closed 6 months ago, 6 comments)
#27 Firefly RK3588 rkllm CPU usage issue (Caical, opened 7 months ago, 9 comments)
#26 taskset f0 ./llm_demo ./qwen.rkllm: E RKNN: [07:37:38.389] failed to convert handle(1020) to fd, ret: -1, errno: 24, errstr: Too many open files; Segmentation fault (core dumped) (lindsayshuo, opened 7 months ago, 3 comments)
#25 Are there plans to support models such as CLIP, GLIP, and DINO? (KunxingYang, opened 7 months ago, 1 comment)
#24 rk3588 Ubuntu, kernel 5.10: compile errors after replacing the NPU code (Caical, opened 7 months ago, 2 comments)
#23 Can it be deployed on the 3566? (Gooddz1, opened 7 months ago, 1 comment)
#22 rk3588s configuration and the scale of supported models (AACengineer, opened 7 months ago, 0 comments)
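Three entries above (#47, #43, #26) report the same runtime error, errno 24 ("Too many open files"), failing at handle 1020, just below the common 1024-descriptor default. As a generic diagnostic sketch, not a fix confirmed by the maintainers (the demo invocation in the comment is taken from issue #26), one can inspect and raise the per-process file-descriptor limit before launching the demo:

```shell
# Show the current soft limit on open file descriptors
# (often 1024, which matches the failing handle id 1020 in the logs):
ulimit -Sn

# Show the hard limit, the ceiling an unprivileged process may raise to:
ulimit -Hn

# Raise the soft limit up to the hard limit for this shell session:
ulimit -Sn "$(ulimit -Hn)"

# Then launch the demo from the same shell (command from issue #26):
# taskset f0 ./llm_demo ./qwen.rkllm
```

A shell-session `ulimit` change only affects processes started from that shell; a persistent change would instead go through `/etc/security/limits.conf` or the service manager.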