Closed aliasboink closed 2 years ago
Hi @banciuadrian,
Thank you for the report. Could you tell me which versions of torchvision and sc2bench you are using?
I recently pushed an update that resolves this issue (https://github.com/yoshitomo-matsubara/sc2-benchmark/commit/1089fb19f16b5c9b1f9aa9f558750e01bd1867cc), since `torchvision.models.segmentation.segmentation` became no longer available after our experiments.
If you use the latest torchvision (0.12.0), the error should go away. Let me know if this resolves the issue on your machine.
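As a quick sanity check before rerunning the scripts, you can compare the installed torchvision version against 0.12.0 (the threshold mentioned above). This is just a hedged sketch, not part of sc2bench; `parse_version` and `torchvision_is_new_enough` are hypothetical helper names:

```python
# Sketch: verify the installed torchvision is new enough for the fix above.
# Assumption: 0.12.0 is the first release with the updated import paths,
# as stated in this thread.
from importlib.metadata import version, PackageNotFoundError


def parse_version(s):
    """Turn '0.12.0' (or '0.11.1+cu113') into a comparable tuple."""
    base = s.split("+")[0]  # drop any local suffix like "+cu113"
    return tuple(int(p) for p in base.split(".")[:3])


def torchvision_is_new_enough(minimum="0.12.0"):
    """Return True if torchvision is installed and at least `minimum`."""
    try:
        installed = version("torchvision")
    except PackageNotFoundError:
        return False
    return parse_version(installed) >= parse_version(minimum)
```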
Hello! That seems to have solved the error! Thank you for the very quick reply and thanks for developing the package.
I was using `torchvision==0.11.1`, as pinned in `environment.yaml` (just pointing this out since you might wish to change it).
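For anyone else hitting this, the pin could be bumped along these lines. This is a hypothetical fragment, not the actual file: the real `environment.yaml` layout and its other dependencies are not shown here.

```yaml
# Hypothetical fragment of environment.yaml: only the torchvision pin
# is shown; 0.12.0 is the version recommended above.
dependencies:
  - pip
  - pip:
      - torchvision==0.12.0
```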
Now I'm having some CUDA issues, but this is to be expected since it's a new computer I haven't set up that well yet, so no worries there. Everything that's related to the package seems to be working.
Once again, thank you lots for the prompt reply. I'll close the issue with this comment as the solution has been found.
I'm trying to run a test following Supervised Compression. I've downloaded the dataset for semantic segmentation (PASCAL VOC 2012), set up the environment in conda using `environment.yaml`, and placed the `resource` directory in `sc2-benchmark-main`.

The command I've tried running is:

The output is:
I've tried backtracking it to some extent, but didn't see anything that would help. I don't see any mention of `model_urls` in `/home/gleip/anaconda3/envs/sc2-benchmark/lib/python3.8/site-packages/torchvision/models/segmentation/deeplabv3.py`.

I'm running Ubuntu 20.04.4 LTS x86_64 on a Legion 5 Pro laptop with an NVIDIA RTX 3060 and an AMD Ryzen 7 5800H.
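The fact that `model_urls` appears nowhere in that file is consistent with the name having been removed upstream rather than with a broken install. A minimal stdlib stand-in (all names here are hypothetical; `fake_deeplabv3` just mimics a module that no longer defines `model_urls`) shows how such a lookup surfaces as an `ImportError`:

```python
# Sketch: reproducing the shape of this kind of error with a stand-in
# module. "fake_deeplabv3" is hypothetical; it mimics an installed
# deeplabv3.py that no longer defines model_urls.
import sys
import types

fake = types.ModuleType("fake_deeplabv3")  # module with no model_urls
sys.modules["fake_deeplabv3"] = fake

try:
    from fake_deeplabv3 import model_urls  # mirrors the failing import
    removed = False
except ImportError as err:
    removed = True
    print("import failed:", err)
```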
Background: I'm trying to use this library to implement split computing with bottleneck injection on another kind of model, so I'm testing things out to see how the results of the paper were achieved.
A response to this would be greatly appreciated, thanks in advance!