ultralytics / ultralytics


Albumentations augmentations not working during multi-GPU training #15755

Open ShubhamNagarkar opened 3 weeks ago

ShubhamNagarkar commented 3 weeks ago

Question

Hi, I am training a YOLOv8-medium object detection model on my custom dataset (42k images, 3 classes). I wanted to add blur and image-compression augmentations from Albumentations, so I followed a solution that modifies the __init__ method of the Albumentations class in augment.py. Everything works fine when I train on a single GPU.

Here is the monkey-patching solution I followed, from the issue titled "How to use albumentations?":
https://github.com/ultralytics/ultralytics/issues/257

But as soon as I set the GPUs to [0,1,2,3], the Albumentations augmentations are no longer applied. Why does it work on a single GPU but not on multi-GPU? Please let me know if there is a workaround.
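For reference, the monkey patch from that issue looks roughly like the sketch below. This is a minimal reconstruction, not the exact code from #257: the transform list and probabilities are illustrative, and the ImageCompression parameter names differ between albumentations versions.

```python
import albumentations as A

from ultralytics.data.augment import Albumentations
from ultralytics.utils import LOGGER, colorstr


def __init__(self, p=1.0):
    """Patched Albumentations.__init__ adding Blur and ImageCompression (illustrative values)."""
    self.p = p
    self.transform = None
    prefix = colorstr("albumentations: ")
    try:
        T = [
            A.Blur(p=0.5, blur_limit=(3, 7)),
            A.ImageCompression(quality_lower=75, p=0.5),  # quality_range=(75, 100) on newer albumentations
        ]
        self.transform = A.Compose(T, bbox_params=A.BboxParams(format="yolo", label_fields=["class_labels"]))
        LOGGER.info(prefix + ", ".join(f"{x}".replace("always_apply=False, ", "") for x in T if x.p))
    except ImportError:  # albumentations not installed
        pass


# Applied in my launch script before calling model.train()
Albumentations.__init__ = __init__
```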


Y-T-G commented 3 weeks ago

You should modify the source file directly instead of monkey patching it, because DDP training launches its worker processes via a newly generated temporary training script, which won't include your monkey-patched code.
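For anyone wondering why: the multi-GPU path goes through generate_ddp_file() in ultralytics/utils/dist.py, which writes a temporary training script to disk and launches it with torchrun. Schematically, the generated file looks something like the sketch below (simplified, not the literal generated code), and since each worker process executes this fresh script, a monkey patch applied in your own launch script never runs in them.

```python
# Simplified sketch of the temporary script the DDP launcher writes and runs
# via torchrun. Each worker imports ultralytics from scratch, so only changes
# that live in the installed source files are visible here.
overrides = {"model": "yolov8m.pt", "data": "custom.yaml", "device": "0,1,2,3"}  # illustrative args

if __name__ == "__main__":
    from ultralytics.models.yolo.detect import DetectionTrainer  # fresh, unpatched import

    trainer = DetectionTrainer(overrides=overrides)
    results = trainer.train()
```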

mpperez3 commented 1 week ago

I have run into the same issue, and the solution in https://github.com/ultralytics/ultralytics/issues/257#issuecomment-1925045780 does not work in a multi-GPU environment.

glenn-jocher commented 1 week ago

@mpperez3 for multi-GPU training, make sure you modify the actual source file rather than monkey patching it, as the DDP worker processes will not pick up those changes.
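Concretely, that means editing the default transform list inside Albumentations.__init__ in ultralytics/data/augment.py of your installed package. Assuming the default list at the time of this thread, the edit would look roughly like this (probabilities are illustrative):

```python
# In ultralytics/data/augment.py, inside Albumentations.__init__:
# raise the probabilities of the transforms you want (defaults were p=0.01 or p=0.0).
T = [
    A.Blur(p=0.5, blur_limit=(3, 7)),  # default was p=0.01
    A.MedianBlur(p=0.01),
    A.ToGray(p=0.01),
    A.CLAHE(p=0.01),
    A.RandomBrightnessContrast(p=0.0),
    A.RandomGamma(p=0.0),
    A.ImageCompression(quality_lower=75, p=0.5),  # default was p=0.0
]
```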

mpperez3 commented 1 week ago

@glenn-jocher Thank you so much for the support and the suggestion! I understand that modifying the source code directly is always a viable option, but in this scenario it is not the best choice if we want to remain compatible with future versions of ultralytics. Monkey patching provides a more flexible alternative, allowing us to make the necessary changes without losing the ability to upgrade easily.

glenn-jocher commented 1 week ago

You're welcome! While monkey patching offers flexibility, it won't survive multi-GPU training because of how DDP spawns its worker processes. Modifying the source directly ensures consistency across all GPUs. Consider maintaining a separate branch to keep your changes easy to carry forward across future updates.
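A quick way to confirm the source edit is actually in effect in the environment the DDP workers will import from is to instantiate the class and print the composed pipeline (a sanity check, assuming the augment.py edit above):

```python
from ultralytics.data.augment import Albumentations

# If the edit took effect, Blur and ImageCompression appear in the printed
# A.Compose pipeline; prints None if albumentations isn't installed.
aug = Albumentations(p=1.0)
print(aug.transform)
```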