IntelLabs / MART

Modular Adversarial Robustness Toolkit

Move load_state_dict into LitModular #145

Closed: dxoigmn closed this 1 year ago

dxoigmn commented 1 year ago

What does this PR do?

Wrapping a module in mart.nn.load_state_dict makes the YAML configuration more verbose, and the wrapper becomes unnecessary if the weight-loading logic moves into LitModular. Ideally, however, we would support custom syntax within the YAML configuration for targets that do not support weight loading themselves. For example:

model:
  _target_: torchvision.models.detection.fasterrcnn_resnet50_fpn
  _load_state_dict_: path/to/weights.pt
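
A minimal sketch of what this could look like on the LitModular side (the helper name and signature below are only illustrative, not the final implementation): the checkpoint is loaded after the module has been instantiated from the config, instead of requiring a mart.nn.load_state_dict wrapper around the module.

import torch

def load_weights_into(module, checkpoint_path):
    # Illustrative only: LitModular loads the checkpoint after the module has
    # been instantiated from the YAML config, rather than wrapping the module
    # in mart.nn.load_state_dict.
    state_dict = torch.load(checkpoint_path, map_location="cpu")
    module.load_state_dict(state_dict)
    return module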

Type of change

Please check all relevant options.

Testing

Please describe the tests that you ran to verify your changes. Consider listing any relevant details of your test configuration.

Before submitting

Did you have fun?

Make sure you had fun coding 🙃

mzweilin commented 1 year ago

Usage example:

python -m mart \
experiment=ArmoryCarlaOverObjDet_TorchvisionFasterRCNN \
fit=false \
+trainer.limit_test_batches=1 \
+model.load_state_dict.losses_and_detections.model=/home/weilinxu/coder/GARD-with-MART/oscar/model_zoo/carla_rgb_weights_eval6.pt
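
The +model.load_state_dict.losses_and_detections.model=<path> override appears to map a submodule name to a checkpoint path. A rough sketch of how such a nested mapping could be applied to the model (the helper and the recursion are assumptions for illustration, not MART's actual code):

import torch
from torch import nn

def apply_load_state_dict(module: nn.Module, spec):
    # Illustrative only: `spec` is either a checkpoint path (str) or a nested
    # mapping of submodule names, mirroring an override such as
    # model.load_state_dict.losses_and_detections.model=<path>.
    if isinstance(spec, str):
        module.load_state_dict(torch.load(spec, map_location="cpu"))
        return
    for name, sub_spec in spec.items():
        apply_load_state_dict(getattr(module, name), sub_spec)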