LambdaLabsML / examples


YAML Config num_workers not updating DataLoader #76

Open aleksajovanovic opened 10 months ago

aleksajovanovic commented 10 months ago

I'm trying to run through the pokemon example on 2x A6000 GPUs, and I keep getting this warning:

```
/opt/conda/lib/python3.10/site-packages/pytorch_lightning/trainer/data_loading.py:105: UserWarning: The dataloader, val dataloader 0, does not have many workers which may be a bottleneck. Consider increasing the value of the `num_workers` argument (try 32 which is the number of cpus on this machine) in the `DataLoader` init to improve performance.
```

I've tried changing `num_workers` in the YAML config and hardcoding it in the `DataModuleFromConfig` class, but the warning still comes up.
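
For context, this is roughly where I set it in the config (a minimal sketch assuming the usual `data` section layout for `main.DataModuleFromConfig`; the actual pokemon fine-tuning config and values may differ):

```yaml
# Illustrative only -- assumes the standard DataModuleFromConfig layout;
# key names and values in the real pokemon config may differ.
data:
  target: main.DataModuleFromConfig
  params:
    batch_size: 4
    num_workers: 16   # this value doesn't seem to reach the DataLoader
```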

Any ideas how I can fix this?