awslabs / keras-apache-mxnet

[DEPRECATED] Amazon Deep Learning's Keras with Apache MXNet support
https://github.com/awslabs/keras-apache-mxnet/wiki

Selecting different GPUs to train different networks #172

Closed. NTNguyen13 closed this issue 5 years ago.

NTNguyen13 commented 5 years ago

I really like the low memory usage of MXNet. I have 4 GPUs in the same computer, and I want to use each of them to train a different network so I can run 4 trainings simultaneously. How can I do this in Keras with the MXNet backend? Currently, when I start another training, it still runs on the same GPU, which hinders performance.

roywei commented 5 years ago

Hi @NTNguyen13, you can specify this by passing a context param during compile. This is only available with the MXNet backend. It's a list of strings, and you can specify any available GPUs for each model. For example, the snippet below uses 2 GPUs for model1 and 2 GPUs for model2. With this, you don't need to use the Keras multi_gpu_model API.

import keras

# MXNet backend only: bind model1 to GPUs 0 and 1.
model1.compile(loss=keras.losses.categorical_crossentropy,
               optimizer=keras.optimizers.Adadelta(),
               metrics=['accuracy'],
               context=["gpu(0)", "gpu(1)"])

# Bind model2 to GPUs 2 and 3, so both models can train at the same time.
model2.compile(loss=keras.losses.categorical_crossentropy,
               optimizer=keras.optimizers.Adadelta(),
               metrics=['accuracy'],
               context=["gpu(2)", "gpu(3)"])
NTNguyen13 commented 5 years ago

Hi @roywei, it works wonders, thank you very much!