Closed SleepProgger closed 5 years ago
This PR exposes the onnx.optimizer passes and the do_constant_folding parameter of torch.onnx.export, so the model can be optimized after exporting.
The use_optimizer parameter of pytorch_to_keras can be any of the passes returned by onnx.optimizer.get_available_passes() (call it to list all available passes).
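For reference, the available pass names can be listed straight from onnx. A minimal sketch, assuming an onnx release that still ships the built-in optimizer module (newer versions moved it to the separate onnxoptimizer package):

```python
import onnx.optimizer

# Print every optimization pass name that could be handed to use_optimizer.
for pass_name in onnx.optimizer.get_available_passes():
    print(pass_name)
```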
There are only 3 optimizer passes I had problems with:
- extract_constant_to_initializer, which doesn't seem to work well together with onnx2keras, but that should be fixable.
- split_init, which led to an output being a numpy array. Might be fixable in onnx2keras.
- split_predict, which completely destroyed the model, and to be honest I am not really sure why.
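One way to run every pass except the three problematic ones above would be to filter the list before handing it over. A sketch, assuming use_optimizer also accepts an explicit list of pass names:

```python
import onnx.optimizer

# Passes reported as problematic in this PR.
PROBLEMATIC = {"extract_constant_to_initializer", "split_init", "split_predict"}

# Everything else should be safe to pass to use_optimizer.
safe_passes = [p for p in onnx.optimizer.get_available_passes()
               if p not in PROBLEMATIC]
```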
Both do_constant_folding and use_optimizer default to False.
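A minimal usage sketch of the two new parameters. The surrounding call follows the documented pytorch_to_keras signature (model, input variable, input shapes); treating use_optimizer=True as "run all available passes" is an assumption based on the description above:

```python
import numpy as np
import torch
from pytorch2keras import pytorch_to_keras

# Toy model and dummy input, only to make the call self-contained.
model = torch.nn.Sequential(torch.nn.Linear(10, 5), torch.nn.ReLU())
input_var = torch.from_numpy(
    np.random.uniform(0, 1, (1, 10)).astype(np.float32)
)

k_model = pytorch_to_keras(
    model,
    input_var,
    [(10,)],
    verbose=True,
    do_constant_folding=True,  # forwarded to torch.onnx.export
    use_optimizer=True,        # or a list of pass names, e.g. safe_passes above
)
```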
Yes, I think it's a good idea to optimize the graph before calling the converter.