wangzhaode / llm-export

llm-export can export LLM models to ONNX.
Apache License 2.0

int8 quantization #37

Closed Vincent131499 closed 4 months ago

Vincent131499 commented 4 months ago

According to the README, exported MNN models default to int4. How can I specify int8 quantization?

wangzhaode commented 4 months ago

There are too many options, so this one isn't exposed as a command-line argument. You can modify it here:

https://github.com/wangzhaode/llm-export/blob/76fd3d9af5074112f21bfc2dbeec115398e4a027/llm_export.py#L79
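In other words, the change is a one-line edit at the linked location: switch the weight-quantization bit width from 4 (int4) to 8 (int8). The exact variable and helper names below are hypothetical, since the real code at that line may differ; this is only a minimal sketch of the idea:

```python
# Hypothetical sketch of the edit the maintainer describes: change the
# hard-coded quantization bit width from 4 (int4 default) to 8 (int8).
# The names QUANT_BIT and quant_config are illustrative, not from llm_export.py.

QUANT_BIT = 8  # was 4 in the original source; 8 selects int8 quantization


def quant_config(quant_bit: int = QUANT_BIT) -> dict:
    """Build a minimal quantization config carrying the chosen bit width."""
    if quant_bit not in (4, 8):
        raise ValueError("only int4 and int8 are meaningful here")
    return {"quant_bit": quant_bit}


print(quant_config())  # {'quant_bit': 8}
```

After editing the constant, re-running the export script would produce an int8-quantized model instead of the int4 default.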