airockchip / rknn-llm
347 stars, 29 forks

Issues (sorted by newest)
#47  RKNN: [22:16:52.189] failed to convert handle(1020) to fd, ret: -1, errno: 24, errstr: Too many open files (17656178609, closed 3 months ago, 2 comments)
#46  LLM models have bad output (kalous12, opened 4 months ago, 9 comments)
#45  Will MoE models be supported in the future? (YanxingLiu, opened 4 months ago, 0 comments)
#44  rkllm_init error in rkllm 1.0.1 (eework, closed 4 months ago, 2 comments)
#43  E RKNN: [00:13:05.159] failed to convert handle(1020) to fd, ret: -1, errno: 24, errstr: Too many open files (wmx-github, closed 4 months ago, 1 comment)
#42  Let's talk about conversion and runtime (80Builder80, opened 4 months ago, 0 comments)
#41  rkllm 1.0.1 always activates 3 cores (AACengineer, opened 4 months ago, 0 comments)
#40  Is custom model quantization supported in rkllm? (MosRat, closed 4 months ago, 3 comments)
#39  phi3 load error in rkllm 1.0.1 (eework, closed 4 months ago, 9 comments)
#38  flask_server.py demo fails to run (westwind027, closed 4 months ago, 7 comments)
#37  Qwen1.5-0.5B model gives inconsistent answers across two runs (AACengineer, opened 4 months ago, 10 comments)
#36  Phi3 inference fails to terminate or produces invalid output at the end (AndySze, closed 4 months ago, 2 comments)
#35  Need a way to stop inference midway (Tidus1991, closed 4 months ago, 1 comment)
#34  Model visualization and inference performance evaluation (AACengineer, opened 5 months ago, 0 comments)
#33  Open Source SDK (80Builder80, closed 4 months ago, 0 comments)
#32  Phi-3 error (dnhkng, opened 5 months ago, 10 comments)
#31  rknn_init failed (YanxingLiu, opened 5 months ago, 1 comment)
#30  Model conversion requires too much memory (YanxingLiu, opened 5 months ago, 6 comments)
#29  Is the performance bottleneck of rknn llm in the CPU? (Caical, opened 5 months ago, 4 comments)
#28  Phi 3 and Llama 3 compatibility? (Pelochus, closed 4 months ago, 6 comments)
#27  Firefly RK3588 rkllm CPU usage issue (Caical, opened 5 months ago, 9 comments)
#26  orangepi@orangepi5:~/rknn-llm-main/rkllm-runtime/example/build/build_linux_aarch64_Release$ taskset f0 ./llm_demo ./qwen.rkllm rkllm init start E RKNN: [07:37:38.389] failed to convert handle(1020) to fd, ret: -1, errno: 24, errstr: Too many open files Segmentation fault (core dumped) (lindsayshuo, opened 5 months ago, 3 comments)
#25  Are there plans to support models such as CLIP, GLIP, and DINO? (KunxingYang, opened 5 months ago, 1 comment)
#24  RK3588 Ubuntu kernel 5.10: build errors after replacing the NPU code (Caical, opened 5 months ago, 2 comments)
#23  Can this be deployed on the RK3566? (Gooddz1, opened 5 months ago, 1 comment)
#22  RK3588S configuration and supported model sizes (AACengineer, opened 5 months ago, 0 comments)
#21  Optimization failure (80Builder80, opened 5 months ago, 4 comments)
#20  Can this be deployed on the RK3588S? (deanwintrester, closed 4 months ago, 1 comment)
#19  Why is max_context_len still required when running a model? (AACengineer, closed 5 months ago, 3 comments)
#18  Exception while converting model (dic1911, opened 5 months ago, 4 comments)
#17  rkllm_init error (beingjoey, closed 5 months ago, 4 comments)
#16  Errors found in building kernel (sdrzmgy, closed 5 months ago, 6 comments)
#15  rkllm-toolkit/examples/test.py broken? (Pelochus, closed 5 months ago, 1 comment)
#14  Are there download links for the large models used in the examples? (kaylorchen, closed 5 months ago, 2 comments)
#13  Low-Level Code Request (fydeos-alex, opened 6 months ago, 0 comments)
#12  Need support for INT4 quantization (fydeos-alex, closed 5 months ago, 2 comments)
#11  RKLLMParam max_context_len: maximum supported length (jun-zhang, opened 6 months ago, 0 comments)
#10  Converting MiniCPM fails (jieli1990, closed 5 months ago, 1 comment)
#9   benchmark data (WangFengtu1996, opened 6 months ago, 3 comments)
#8   Improvement work that needs to be done (fydeos-alex, opened 6 months ago, 0 comments)
#7   What is the cause of process interruption during model conversion? (Han-WG, closed 6 months ago, 2 comments)
#6   Errors when compiling the kernel (sdrzmgy, opened 6 months ago, 3 comments)
#5   Are vision or multimodal large models currently supported? (Han-WG, closed 6 months ago, 1 comment)
#4   E RKNN: [10:27:30.619] failed to allocate handle, ret: -1, errno: 14, errstr: Bad address Segmentation fault (core dumped) (fydeos-alex, opened 6 months ago, 15 comments)
#3   Doc misspelled (fydeos-alex, closed 5 months ago, 1 comment)
#2   Phi-2 model load error (fydeos-alex, closed 6 months ago, 2 comments)
#1   update (SoulProficiency, opened 6 months ago, 0 comments)
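Several entries above (#47, #43, #26) report the same runtime failure, errno 24 ("Too many open files", EMFILE), which occurs when a process hits its open-file-descriptor limit. The repository does not state a fix here, but a common first step, sketched below under the assumption that a too-low soft limit is the cause, is to check and raise the limit in the shell before launching the demo:

```shell
# errno 24 (EMFILE) means the per-process open-file limit was reached.
# Inspect the current soft and hard limits:
ulimit -Sn
ulimit -Hn

# Raise the soft limit to the hard limit for this shell session;
# the demo must then be launched from this same shell.
ulimit -Sn "$(ulimit -Hn)"

# Example invocation from issue #26 (paths are from that report):
# taskset f0 ./llm_demo ./qwen.rkllm
```

An unprivileged process can raise its soft limit only up to the hard limit; raising the hard limit itself requires root (e.g. via /etc/security/limits.conf), which is why the sketch targets the soft limit.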