airockchip / rknn-llm


internlm2-chat-1_8b model conversion error #61

Open · TRQ-UP opened this issue 4 months ago

TRQ-UP commented 4 months ago

1. Conversion error: Error: Internal: src/sentencepiece_processor.cc(1101) [model_proto->ParseFromArray(serialized.data(), serialized.size())]! Catch exception when converting model! Tool version: rkllm-toolkit 1.0.1
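The error is raised by SentencePiece while it parses the tokenizer model. A minimal check that isolates the tokenizer from the toolkit, assuming the HuggingFace directory contains a tokenizer.model file (the path is the one from this report):

```python
import sentencepiece as spm

# Try to parse tokenizer.model directly with SentencePiece.
# If this raises the same ParseFromArray error, the file itself is unreadable
# (for example an un-pulled Git LFS pointer or a truncated download),
# and the failure is not specific to rkllm-toolkit.
sp = spm.SentencePieceProcessor()
sp.Load('/home/mct/rk-llm/internlm2-chat-1_8b/tokenizer.model')
print('tokenizer vocab size:', sp.GetPieceSize())
```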

Conversion code:

```python
from rkllm.api import RKLLM

modelpath = '/home/mct/rk-llm/internlm2-chat-1_8b/'
llm = RKLLM()

# Load model
ret = llm.load_huggingface(model=modelpath)
if ret != 0:
    print('Load model failed!')
    exit(ret)

# Build model
ret = llm.build(do_quantization=True, quantized_dtype='w8a8', target_platform='rk3588')
if ret != 0:
    print('Build model failed!')
    exit(ret)

# Export rkllm model
ret = llm.export_rkllm("./internlm2.rkllm")
if ret != 0:
    print('Export model failed!')
    exit(ret)
```
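As a second check before the conversion, loading the tokenizer through transformers from the same directory can confirm whether the checkpoint files are complete; this is only a sketch, assuming a standard internlm2-chat-1_8b checkpoint, which needs trust_remote_code=True for its custom tokenizer code:

```python
from transformers import AutoTokenizer

# Load the tokenizer from the same HuggingFace directory used for conversion.
# A failure here suggests the checkpoint files (e.g. tokenizer.model,
# tokenizer_config.json) are missing or incomplete rather than a toolkit issue.
tok = AutoTokenizer.from_pretrained('/home/mct/rk-llm/internlm2-chat-1_8b/',
                                    trust_remote_code=True)
print(tok.encode('hello'))
```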