airockchip / rknn-llm · 417 stars · 36 forks
Issues (sorted by newest first)
#21 · Optimization failure · 80Builder80 · opened 7 months ago · 4 comments
#20 · Can it be deployed on the 3588S? · deanwintrester · closed 6 months ago · 1 comment
#19 · Why does max_context_len still need to be specified when running the model? · AACengineer · closed 7 months ago · 3 comments
#18 · Exception while converting model · dic1911 · opened 7 months ago · 4 comments
#17 · rkllm_init error · beingjoey · closed 7 months ago · 4 comments
#16 · Errors found in building kernel. · sdrzmgy · closed 7 months ago · 6 comments
#15 · rkllm-toolkit/examples/test.py broken? · Pelochus · closed 7 months ago · 1 comment
#14 · Is there a specific download link for the large models used in the examples? · kaylorchen · closed 7 months ago · 2 comments
#13 · Low-Level Code Request · fydeos-alex · opened 7 months ago · 0 comments
#12 · Need Support For INT4 Quantize · fydeos-alex · closed 7 months ago · 2 comments
#11 · RKLLMParam max_context_len Maximum supported length · jun-zhang · opened 7 months ago · 0 comments
#10 · MiniCPM conversion failed · jieli1990 · closed 7 months ago · 1 comment
#9 · benchmark data · WangFengtu1996 · opened 7 months ago · 3 comments
#8 · Improve Work Need To Be Done · fydeos-alex · opened 7 months ago · 0 comments
#7 · What is the cause of process interruption during model conversion? · Han-WG · closed 7 months ago · 2 comments
#6 · Error when compiling the kernel · sdrzmgy · opened 8 months ago · 3 comments
#5 · Are vision or multimodal large models currently supported? · Han-WG · closed 7 months ago · 1 comment
#4 · E RKNN: [10:27:30.619] failed to allocate handle, ret: -1, errno: 14, errstr: Bad address Segmentation fault (core dumped) · fydeos-alex · opened 8 months ago · 15 comments
#3 · Doc Misspelled · fydeos-alex · closed 7 months ago · 1 comment
#2 · Phi-2 Model Load Error · fydeos-alex · closed 8 months ago · 2 comments
#1 · update · SoulProficiency · opened 8 months ago · 0 comments