airockchip / rknn-llm · Other · 240 stars · 25 forks
Issues (newest first)
#74 · Consider Open-Sourcing the Code After Nearly Two Months Without Updates · AndySze · opened 4 days ago · 2 comments
#73 · Error compiling kernel · windskyxb · opened 6 days ago · 0 comments
#72 · Beginner help: how to update the NPU kernel version to 0.9.6 · xinyinan9527 · opened 6 days ago · 1 comment
#71 · Support RK3566? · tolidano · opened 6 days ago · 0 comments
#70 · Support qwen2-7b? · iamananba · opened 1 week ago · 0 comments
#69 · How to adapt the RWKV LLM? · xinyinan9527 · opened 1 week ago · 0 comments
#68 · Error: iommu_context->weight_memory is NULL · meanshe · opened 2 weeks ago · 0 comments
#67 · Cannot find the clImportMemoryARM · yylt · closed 2 weeks ago · 1 comment
#66 · Embeddings generation not supported? · woutermans · opened 3 weeks ago · 0 comments
#65 · ERROR: modpost: "iommu_get_dma_cookie" [drivers/rknpu/rknpu.ko] undefined! · meanshe · opened 4 weeks ago · 2 comments
#64 · internlm2_rkLLM conversation quality is poor · shaqing · opened 1 month ago · 2 comments
#63 · qwen-1.8b model conversation results are poor · shaqing · opened 1 month ago · 0 comments
#62 · Problems running a large model · shaqing · closed 3 weeks ago · 1 comment
#61 · internlm2-chat-1_8b model conversion error · TRQ-UP · opened 1 month ago · 0 comments
#60 · Hi, where is the code to download this? · federicoparra · closed 1 month ago · 1 comment
#59 · Is rk3582 supported? · walletiger · opened 1 month ago · 2 comments
#58 · Instructions? · mtwlz · opened 1 month ago · 0 comments
#57 · How to uninstall the latest rkllm (1.0.1) and install the old one (1.0.0) · thanhtantran · opened 1 month ago · 1 comment
#56 · Does this project plan to support multimodal large models, e.g. MiniCPM-V_2.5? · luyu0816 · opened 1 month ago · 0 comments
#55 · Specific Token Appeared in Response · fydeos-alex · opened 1 month ago · 0 comments
#54 · OpenAI-compatible API interface · smy116 · opened 1 month ago · 1 comment
#53 · The server demo fails to run on the board; please help diagnose why · luyu0816 · opened 1 month ago · 6 comments
#52 · Error when converting a model · 17656178609 · closed 1 month ago · 0 comments
#51 · Why is result->num in RKLLMResult never assigned? I want to measure tokens generated per second; how should I do that? · 17656178609 · opened 1 month ago · 1 comment
#50 · The output is all "!!!!!!!" · wmx-github · opened 1 month ago · 0 comments
#49 · Can multimodal models be supported? · 17656178609 · closed 3 weeks ago · 3 comments
#48 · Procedure and Model Validation · kalous12 · opened 1 month ago · 0 comments
#47 · RKNN: [22:16:52.189] failed to convert handle(1020) to fd, ret: -1, errno: 24, errstr: Too many open files · 17656178609 · closed 3 weeks ago · 2 comments
#46 · LLM models have bad output · kalous12 · opened 1 month ago · 9 comments
#45 · Will MoE models be supported in the future? · YanxingLiu · opened 1 month ago · 0 comments
#44 · rkllm_init error in rkllm 1.0.1 · eework · closed 1 month ago · 2 comments
#43 · E RKNN: [00:13:05.159] failed to convert handle(1020) to fd, ret: -1, errno: 24, errstr: Too many open files · wmx-github · closed 1 month ago · 1 comment
#42 · Let's talk about converting and runtime · 80Builder80 · opened 1 month ago · 0 comments
#41 · rkllm 1.0.1 always activates 3 cores · AACengineer · opened 1 month ago · 0 comments
#40 · Is custom model quantization supported for rkllm? · MosRat · closed 1 month ago · 3 comments
#39 · phi3 load error in rkllm 1.0.1 · eework · closed 1 month ago · 9 comments
#38 · flask_server.py demo fails to run · westwind027 · closed 1 month ago · 7 comments
#37 · Qwen1.5-0.5B model gives different answers on two runs · AACengineer · opened 1 month ago · 10 comments
#36 · Phi3 inference never terminates, or produces invalid output at the end · AndySze · closed 1 month ago · 2 comments
#35 · Need a way to stop inference midway · Tidus1991 · closed 1 month ago · 1 comment
#34 · Model visualization and inference performance evaluation · AACengineer · opened 2 months ago · 0 comments
#33 · Open Source SDK · 80Builder80 · closed 1 month ago · 0 comments
#32 · Phi-3 error · dnhkng · opened 2 months ago · 10 comments
#31 · rknn_init failed · YanxingLiu · opened 2 months ago · 1 comment
#30 · Model conversion requires too much memory · YanxingLiu · opened 2 months ago · 6 comments
#29 · Is the performance bottleneck of rknn llm in the CPU? · Caical · opened 2 months ago · 4 comments
#28 · Phi 3 and Llama 3 compatibility? · Pelochus · closed 1 month ago · 6 comments
#27 · Firefly rk3588 rkllm CPU usage issue · Caical · opened 2 months ago · 9 comments
#26 · Running `taskset f0 ./llm_demo ./qwen.rkllm` on orangepi5: "E RKNN: [07:37:38.389] failed to convert handle(1020) to fd, ret: -1, errno: 24, errstr: Too many open files", then Segmentation fault (core dumped) · lindsayshuo · opened 2 months ago · 3 comments
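Several of the issues above (#47, #43, #26) report the same failure: `errno: 24` is `EMFILE`, meaning the process hit its open-file-descriptor limit while loading the model. A minimal workaround sketch, assuming a Linux shell on the board; the limit value and the demo invocation (taken from issue #26) are illustrative, not a documented rknn-llm requirement:

```shell
# errno 24 (EMFILE) means the process ran out of file descriptors.
# Inspect the current soft limit (often 1024 on stock images):
ulimit -n

# Raise the soft limit for this shell and its children before running the
# demo; 65536 is an arbitrary generous value, not an rknn-llm requirement.
ulimit -n 65536

# Then re-run the demo as in issue #26 (reporter's own binary/model paths):
# taskset f0 ./llm_demo ./qwen.rkllm
```

The change applies only to the current shell session; a persistent limit would go in `/etc/security/limits.conf` or the service's systemd unit.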
#25 · Are there plans to support models such as CLIP, GLIP, and DINO? · KunxingYang · opened 2 months ago · 1 comment