Closed isaaccorley closed 8 months ago
Looking forward to testing this update!
Have you checked if you can use a timm `convnext_nano` encoder with this PR?
Yes, this PR allows the use of ConvNeXt as the encoder!
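For anyone who wants to try it right away, here is a minimal sketch. The constructor arguments are assumed from the segmentation_models.pytorch-style API that torchseg forks, and `convnext_nano` is just one example timm backbone, so treat this as a starting point rather than the final API:

```python
import torch
import torchseg

# Minimal sketch: a U-Net with a timm ConvNeXt encoder (convnext_nano).
# ConvNeXt (like Swin) only exposes 4 feature maps (strides 4-32), so the
# encoder depth and decoder channels are reduced from the usual 5 stages.
model = torchseg.Unet(
    "convnext_nano",
    in_channels=3,
    classes=2,
    encoder_weights=True,                 # load timm's pretrained weights
    encoder_depth=4,
    decoder_channels=(256, 128, 64, 32),
)

# Dummy forward pass; the output resolution depends on the encoder's
# feature strides, so just inspect the shape.
x = torch.randn(2, 3, 256, 256)
with torch.no_grad():
    out = model(x)
print(out.shape)
```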
@notprime Can you give this a review and maybe share some thoughts?
Looking forward to this update...
@JulienMaille @ogencoglu @notprime These features are now merged and you can install them using `pip install --pre torchseg` or `pip install 'torchseg==0.0.1a2'`.
This PR seeks to do a few things:

- Drop the `pretrainedmodels` and `efficientnet-pytorch` dependencies, as they are no longer maintained
- Drop the `mock` and `torchvision` dependency (torchvision isn't used anywhere anyway)
- Replace the existing encoder implementations with a timm-based encoder (`torchseg.encoders.TimmEncoder`)
- Allow timm models to be used as encoders (see `torchseg.encoders.supported`). This includes ConvNext and Swin pretrained backbones; a quick way to browse the supported names is sketched below
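To get a feel for what's available, you can browse the supported encoder names. The exact structure of `torchseg.encoders.supported` is defined by this PR, so the snippet below is only a sketch of how you might inspect it, cross-checked against timm's own model registry:

```python
import timm
import torchseg

# torchseg.encoders.supported enumerates the timm backbones torchseg can
# use as encoders; iterate it to see the names (the exact container type
# may differ, so adjust as needed).
names = list(torchseg.encoders.supported)
print(len(names))
print([n for n in names if "convnext" in n][:5])

# Cross-check against timm itself; these are candidate encoder names.
print(timm.list_models("convnext*")[:5])
print(timm.list_models("swin*")[:5])
```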
There's some other misc cleanup that is done:

- Remove the `Activation` class. We instead let the user choose the head activation when creating a model. Still defaults to `nn.Identity()`
- Pass `encoder_params` to `timm.create_model(**kwargs)` in case users want to further customize the backbone (both options are sketched below)
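To illustrate those last two points, here is a hedged sketch of how the new options might be used. The `activation` and `encoder_params` argument names are taken from the description above, and everything else is assumed, so check the merged code for the exact API:

```python
import torch.nn as nn
import torchseg

# Sketch of the two cleanup items described above:
#   * the head activation is chosen by the user (defaults to nn.Identity())
#   * encoder_params is forwarded to timm.create_model(**kwargs)
model = torchseg.Unet(
    "convnext_nano",
    in_channels=3,
    classes=1,
    encoder_weights=True,
    encoder_depth=4,
    decoder_channels=(256, 128, 64, 32),
    activation=nn.Sigmoid(),                  # user-chosen head activation
    encoder_params={"drop_path_rate": 0.1},   # extra kwargs for timm.create_model
)
```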