huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers

Pegasus from PyTorch to TensorFlow #12568

Closed · karimfayed closed this issue 3 years ago

karimfayed commented 3 years ago

I have fine-tuned a PEGASUS model for abstractive summarization using this script, which uses Hugging Face Transformers. The output model is in PyTorch.

According to the Hugging Face docs, the following command is supposed to do the required conversion:

python convert_graph_to_onnx.py --framework <pt, tf> --model bert-base-cased bert-base-cased.onnx

I use Colab, and I ran the following command to convert my PEGASUS model:

!python convert_graph_to_onnx.py --framework <pt, tf> --model ./results/checkpoint-4000 ./results/checkpoint-4000.onnx

I keep getting the message below, which is confusing because the documentation says the script convert_graph_to_onnx.py is at the root of the transformers sources:

[Screenshot (84): message produced by the command]

Thank you in advance.

LysandreJik commented 3 years ago

You should replace <pt, tf> with the framework you want to use to export the graph: either pt or tf.
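For the command in the original post, here is a minimal sketch of a corrected Colab invocation with the <pt, tf> placeholder replaced by pt, since the fine-tuned checkpoint is a PyTorch model. The src/transformers/ script location and the ./onnx/pegasus.onnx output path are assumptions for illustration, not taken from the issue:

```
# Fetch the conversion script from the transformers repository
# (the src/transformers/ location is an assumption; adjust to your checkout)
!git clone https://github.com/huggingface/transformers.git

# Export the fine-tuned PyTorch checkpoint to ONNX.
# --framework pt selects PyTorch as the source framework; the script may
# refuse to write into a non-empty folder, so an empty output directory
# such as ./onnx/ is used here (illustrative path).
!python transformers/src/transformers/convert_graph_to_onnx.py \
    --framework pt \
    --model ./results/checkpoint-4000 \
    ./onnx/pegasus.onnx
```

Here --framework pt tells the script to trace the PyTorch model when building the ONNX graph.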

github-actions[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.