OpenBMB / BMInf

Efficient Inference for Big Models
Apache License 2.0

[QUESTION] Is it possible to load the complete model instead of the INT8 model? #45

Closed skpig closed 2 years ago

skpig commented 2 years ago

I've already downloaded the complete model from https://wudaoai.cn and would like to run inference with it. Is it possible to use BMInf's API (such as EVA.dialogue()) to do inference with the complete model?

a710128 commented 2 years ago

No, this is not supported at the moment.
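For context, the supported path at the time was to let BMInf load its own INT8 quantized checkpoint rather than a locally downloaded full-precision model. A minimal sketch of that usage follows; the `EVA.dialogue()` call is taken from the question above, while the `bminf.models.EVA` constructor name is an assumption and not verified against the library:

```python
# Hedged sketch of BMInf's supported usage: the library fetches and runs
# its own INT8 checkpoint; pointing it at a locally downloaded
# full-precision model is not supported (per this issue).
try:
    import bminf  # requires the bminf package to be installed
    HAVE_BMINF = True
except ImportError:
    HAVE_BMINF = False


def chat(context):
    """Run one dialogue turn with BMInf's built-in INT8 EVA model.

    `dialogue()` is the API mentioned in the question; the constructor
    name `bminf.models.EVA` is a hypothetical assumption for this sketch.
    """
    eva = bminf.models.EVA()       # loads the INT8 model (assumed constructor)
    return eva.dialogue(context)   # assumed to return the model's reply


if HAVE_BMINF:
    print(chat(["Hello!"]))
```

The guard around the import lets the sketch be read (and imported) without BMInf installed; the actual call still downloads and runs the quantized model, not a user-supplied full-precision one.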