yoshitomo-matsubara / torchdistill

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
https://yoshitomo-matsubara.net/torchdistill/
MIT License
1.37k stars · 132 forks

[BUG] ModuleNotFoundError: No module named 'torch._six' #448

Closed · nirgoren closed 6 months ago

nirgoren commented 6 months ago

Bug description When trying to import torchdistill.core.forward_hook with an up-to-date version of PyTorch, the import fails with ModuleNotFoundError: No module named 'torch._six'.

To Reproduce

  1. Exact command to run your code: import torchdistill.core.forward_hook
  2. Whether or not you made any changes in Python code (if so, how?): Did not make changes.
  3. YAML config file - not relevant.
  4. Log file - not relevant.

Expected behavior The import should succeed without errors.

Environment:

Additional context Related to this change in PyTorch: https://github.com/pytorch/pytorch/pull/94709. Apparently string_classes is no longer needed, and str can be used instead.
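The fix suggested by that PR can be sketched as a small compatibility shim. This is an illustrative example, not torchdistill's actual code; the is_string_like helper is a hypothetical name.

```python
# Hedged sketch of a shim for the removed torch._six module
# (see pytorch/pytorch#94709). On torch >= 2.0 the import fails,
# so we fall back to the plain built-in str, as the PR suggests.
try:
    from torch._six import string_classes  # older torch releases
except ImportError:  # torch >= 2.0 removed torch._six entirely
    string_classes = str

def is_string_like(obj):
    # Replicates old isinstance(obj, string_classes) checks on any torch version.
    return isinstance(obj, string_classes)
```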

yoshitomo-matsubara commented 6 months ago

Hi @nirgoren ,

You're using an old version of torchdistill. This is already resolved in torchdistill v1.0.0

nirgoren commented 6 months ago

> Hi @nirgoren ,
>
> You're using an old version of torchdistill. This is already resolved in torchdistill v1.0.0

I see. Note that pip automatically installed v0.3.3 for me because I have torch v2.2.1 installed, and torchdistill v1.0.0 requires torch <=2.1.0. It would be nice to have compatibility with the latest version of torch, if that doesn't cause any issues.
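The resolver behavior described above can be illustrated with the packaging library (the same one pip relies on for version specifiers); the pin shown is the <=2.1.0 bound mentioned in the comment.

```python
# Why pip fell back to torchdistill v0.3.3: torchdistill v1.0.0's upper
# bound on torch excludes torch v2.2.1, so the resolver picks an older
# torchdistill release whose requirements are satisfiable.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

torch_pin = SpecifierSet("<=2.1.0")      # torchdistill v1.0.0's bound on torch
print(Version("2.1.0") in torch_pin)     # True: torch 2.1.0 satisfies the pin
print(Version("2.2.1") in torch_pin)     # False: the installed torch is too new
```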

yoshitomo-matsubara commented 6 months ago

You could still use torchdistill v1.0.0 with the latest torch if you force-install torchdistill, though I've not tested it.

From torchdistill v1.0.0, I started specifying min/max versions of key requirements to avoid repeating the mistakes of older torchdistill releases, e.g., v0.3.3.

The next release of torchdistill will support the latest torch and is right around the corner, probably next week. Stay tuned!