The warning is a problem during full training, as the log grows huge (400 MB instead of 2 MB).
The warning proposes a fix: call flatten_parameters() to compact the weights again.
I am not sure where to call it yet.
Full warning below:
../aten/src/ATen/native/cudnn/RNN.cpp:1278: UserWarning: RNN module weights are not part of single contiguous chunk of memory. This means they need to be compacted at every call, possibly greatly increasing memory usage. To compact weights again call flatten_parameters().
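A common placement (not confirmed for this repo, just a sketch of the usual PyTorch pattern) is to call flatten_parameters() on the RNN module at the top of forward(), so the weights are re-compacted into one contiguous chunk before the cuDNN kernel runs. Module name and sizes below are illustrative:

```python
import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        # Sizes are placeholders, not taken from the repro.
        self.lstm = nn.LSTM(input_size=8, hidden_size=16,
                            num_layers=2, batch_first=True)

    def forward(self, x):
        # Re-compact the RNN weights before the cuDNN call; this is the
        # fix the UserWarning suggests. Placement here is an assumption.
        self.lstm.flatten_parameters()
        out, _ = self.lstm(x)
        return out

model = Model()
y = model(torch.randn(4, 10, 8))
print(tuple(y.shape))  # (4, 10, 16)
```

Weights typically become non-contiguous after the module is moved between devices or wrapped (e.g. DataParallel), so calling it once in __init__ is often not enough.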
Repro for commit https://github.com/ryanleary/mlperf-rnnt-ref/commit/4082f086ec4834886cceb927dbb1454eca44c68d: