Open garbit opened 1 year ago
I'm trying to run the model on AWS Inferentia (inf1 hardware) for deployment; however, I can't seem to get the optimum-cli neuron tooling to work.
Has anyone had a similar experience? This is the command I'm running:
optimum-cli export neuron --model /root/multilingual_debiased-0b549669.ckpt --task token-classification --batch_size 30 --sequence_length 512 --auto_cast matmul --auto_cast_type bf16 multilingual_debiased-0b549669