Hi @kepengxu,

Thanks for this nice work! I see the checkpoints are currently hosted on Google Drive; this PR aims to make your models discoverable from https://huggingface.co/models?pipeline_tag=image-feature-extraction.

I wrote a quick PoC to showcase that you can easily integrate with the 🤗 hub, so that you can automatically load the various PGTFormer models using `from_pretrained` (and push them using `push_to_hub`), track download numbers for your models (similar to models in the Transformers library), and have nice model cards on a per-model basis. It leverages the `PyTorchModelHubMixin` class, which adds these methods to the model class.
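For context, this is roughly what the integration looks like on the class side. This is a minimal sketch only: the constructor arguments below are placeholders, not the real `PGTFormer` signature from `archs/pgtformer_arch.py`.

```python
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin

# Minimal sketch: inheriting from PyTorchModelHubMixin is all that is needed
# for the class to gain save_pretrained / from_pretrained / push_to_hub.
# The layers and arguments here are illustrative placeholders.
class PGTFormer(nn.Module, PyTorchModelHubMixin):
    def __init__(self, dim: int = 64):
        super().__init__()
        self.body = nn.Linear(dim, dim)

    def forward(self, x):
        return self.body(x)
```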
Usage is as follows:
```python
from archs.pgtformer_arch import PGTFormer

# define the model
model = PGTFormer(...)
# load the pretrained weights
model.load_state_dict(...)
# push to the hub
model.push_to_hub("your-hf-username-or-organization/pgtformer-base")
# reload
model = PGTFormer.from_pretrained("your-hf-username-or-organization/pgtformer-base")
```
This means people don't need to manually download a checkpoint to their local environment first; it is loaded automatically from the hub.
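As a side note, the same mixin also provides `save_pretrained`, in case you want to write the checkpoint to a local directory and inspect the files before uploading anything (the folder name below is just a placeholder):

```python
# Writes the weights (and, if the init args are serializable, a config.json)
# to a local folder; the same folder can be pushed to the hub later.
model.save_pretrained("pgtformer-base")
```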
Would you be interested in this integration?
Kind regards,
Niels
Note: please don't merge this PR before pushing a model to the hub :)