Closed Explorer1092 closed 5 years ago
When the train.txt file is large (2G), train.py uses a lot of memory. Is this normal?

Yes, this is normal. All of the training data is loaded into RAM at once; it isn't line-buffered or read from disk incrementally with a Python generator. Take a look at utils.load_dataset(...) to see exactly what's going on.