Closed chienlinhuang1116 closed 8 years ago
Hi,
Get the latest version of the torch-ipc package ( https://github.com/twitter/torch-ipc ) and try setting poolSize to something small, like 128 or so.
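For anyone landing here later, a minimal sketch of where such an option might go. The Dataset/sampledBatcher names follow torch-dataset's README; passing poolSize in this options table is an assumption based on this thread, not a documented guarantee:

```lua
-- Sketch only; assumes torch-dataset's Dataset/sampledBatcher API.
local Dataset = require 'dataset.Dataset'

local dataset = Dataset('train.t7')
local getBatch, numBatches = dataset.sampledBatcher({
   batchSize = 256,  -- samples per batch
   poolSize = 128,   -- assumption: small pool, per the suggestion above
})
```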
Thanks, Zak
Thank you, Zak. It was resolved by setting "poolSize".
Hi, I am using "torch-distlearn" for speech recognition. First, I convert the inputs and labels and save them into the Torch file "train.t7" by using
Then, I use “torch-dataset” to handle the input. My question is whether batchSize determines the number of running threads or jobs. If I set “--batchSize 256”, CPU usage suggests about 256 threads, but there are only 32 CPU cores on my GPU machine. Is batchSize related to the number of threads or jobs?
In addition, for some reason, we cannot set batchSize larger than 256 in “torch-dataset”. For example, there are errors when setting “batchSize = 512”. Do you know the reason?
Thank you.