kolonist-minjun / ModelZoo-PIDNet

Official NetsPresso-compatible repo for compressing and retraining PIDNet.
MIT License

[pidnet_netspresso.py] Load PIDNet and convert to torch.fx #1

Open kolonist-minjun opened 1 year ago

kolonist-minjun commented 1 year ago

Convert PIDNet to torch.fx and compress it through PyNetsPresso, all within a single .py file.

-> This way, the previously separate convert -> compress -> restore steps can be performed at once, which is more convenient for users.
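
A rough sketch of what that single file might look like. The PIDNet factory (`get_pred_model`) and the NetsPresso compression step are assumptions/placeholders rather than the exact PyNetsPresso API; only the torch.fx conversion is standard PyTorch.

```python
# Sketch of the single-file convert -> compress -> restore flow. The PIDNet
# factory `get_pred_model` and the NetsPresso compression step are assumptions
# (placeholders), not the exact PyNetsPresso API.
import torch
import torch.fx

from models.pidnet import get_pred_model  # assumed PIDNet model-zoo factory


def convert_compress_restore(checkpoint_path: str, num_classes: int = 19):
    # 1. Build PIDNet and load the trained weights.
    model = get_pred_model(name="pidnet_s", num_classes=num_classes)
    state = torch.load(checkpoint_path, map_location="cpu")
    model.load_state_dict(state, strict=False)

    # 2. Convert to a torch.fx GraphModule so NetsPresso can analyze it.
    graph_module = torch.fx.symbolic_trace(model.eval())
    torch.save(graph_module, "pidnet_fx.pt")

    # 3. Compress through PyNetsPresso (placeholder for the actual call).
    # compressed_path = compress_with_netspresso("pidnet_fx.pt")

    # 4. Restore: the (compressed) GraphModule is itself an nn.Module and can
    #    be loaded back for retraining or inference.
    restored = torch.load("pidnet_fx.pt")
    return restored
```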

kolonist-minjun commented 1 year ago

For PIDNet, the structure used for training and the structure used for inference are different. Since the goal is to increase inference speed, let's build the model based on the inference structure when loading it.
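
A minimal sketch of that, assuming the PIDNet constructor exposes an `augment` flag (as in the official PIDNet repo) that toggles the auxiliary training heads; the constructor arguments shown are the PIDNet-S values and are illustrative.

```python
# Minimal sketch: build PIDNet in its inference form when loading. Assumes the
# constructor takes an `augment` flag (True adds the auxiliary training heads,
# False keeps only the final segmentation output); argument values are the
# assumed PIDNet-S settings.
from models.pidnet import PIDNet  # assumed PIDNet class from the model zoo


def load_pidnet_for_inference(num_classes: int = 19):
    model = PIDNet(
        m=2, n=3,
        num_classes=num_classes,
        planes=32, ppm_planes=96, head_planes=128,
        augment=False,  # inference structure only
    )
    return model.eval()
```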

kolonist-minjun commented 1 year ago

For PIDNet, the model structure is created according to the model type (s, m, l), and then the checkpoint is loaded. Since the structure changes after NetsPresso compression, let's declare an additional model type so that the compressed model can be loaded directly.
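
One way to do this is a small factory that treats the compressed model as its own type and loads the saved torch.fx GraphModule directly; the names below (`get_pred_model`, the "compressed" type) are illustrative assumptions.

```python
# Sketch of a model factory with an extra "compressed" type next to s/m/l.
# `get_pred_model` and the type names are assumptions for illustration.
import torch

from models.pidnet import get_pred_model  # assumed PIDNet model-zoo factory


def build_model(model_type: str, checkpoint: str, num_classes: int = 19):
    if model_type in ("pidnet_s", "pidnet_m", "pidnet_l"):
        # Original flow: build the architecture, then load the checkpoint.
        model = get_pred_model(name=model_type, num_classes=num_classes)
        state = torch.load(checkpoint, map_location="cpu")
        model.load_state_dict(state, strict=False)
        return model
    if model_type == "compressed":
        # After NetsPresso compression the structure no longer matches s/m/l,
        # so load the saved torch.fx GraphModule as-is.
        return torch.load(checkpoint, map_location="cpu")
    raise ValueError(f"unknown model type: {model_type}")
```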

kolonist-minjun commented 1 year ago

For PIDNet, the structure used for training and the structure used for inference are different. Since the goal is to increase inference speed, let's build the model based on the inference structure when loading it.

It is fine to compress the model in its inference form, but if the model itself is exported in the inference form, it can no longer be trained. Set the model structure to the train form, perform the compression against the inference form, and finally restore the model to the train form.
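
A minimal sketch of one way to read that: keep the train structure (auxiliary heads included) so retraining stays possible, run the compression pass with the model in eval/inference mode, then put the compressed module back into train mode. The PIDNet constructor arguments and the compression call are assumptions/placeholders.

```python
# Sketch of train -> (compress in inference mode) -> restore-to-train, under
# the interpretation above. The constructor arguments are the assumed PIDNet-S
# values and the compression step is a placeholder, not the PyNetsPresso API.
import torch.fx

from models.pidnet import PIDNet  # assumed PIDNet class from the model zoo


def compress_keep_trainable(num_classes: int = 19):
    # 1. Declare the model in its train form (augment=True keeps the aux heads).
    model = PIDNet(m=2, n=3, num_classes=num_classes,
                   planes=32, ppm_planes=96, head_planes=128,
                   augment=True)

    # 2. Trace and compress while the model is in inference (eval) mode.
    traced = torch.fx.symbolic_trace(model.eval())
    compressed = traced  # placeholder for the PyNetsPresso compression call

    # 3. Restore to train form: the traced module still has the training heads,
    #    so switching back to train mode allows retraining.
    return compressed.train()
```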