Thanks. Let me test this. I recently refactored a few things and updated the examples. I did not change the API for ModelMGPU, though, so it should still work.
I was already calling the super init at line 344: https://github.com/avolkov1/keras_experiments/blob/99cd23118ad731553ef708d9eff480da35db7ec9/keras_exp/multigpu/_multigpu.py#L344
Let me test with the latest Keras.
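For context, here is a minimal sketch of the super-init pattern under discussion; the class name and constructor signature are hypothetical simplifications, not the repo's exact code:

```python
# Hypothetical simplification of a multi-GPU Model subclass that
# calls the base-class initializer, as referenced above.
from keras.models import Model

class ModelMGPUSketch(Model):
    def __init__(self, inputs, outputs, serial_model):
        # Calling the super init registers inputs/outputs with the base
        # Model, so layer bookkeeping and serialization work normally.
        super(ModelMGPUSketch, self).__init__(inputs=inputs, outputs=outputs)
        self._smodel = serial_model
```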
Thanks again. I tested this and verified the required fix. I'll merge it in. I also added another MGPU class, a wrapper around the Keras multi_gpu_model function, that I will push soon.
```python
from keras.models import Model
from keras.utils import multi_gpu_model


class ModelKerasMGPU(Model):
    '''
    Wrapper class around "keras.utils.multi_gpu_model". This class enables
    loading and saving transparently.
    '''
    def __init__(self, ser_model, gpus):  # @IgnorePep8 pylint: disable=super-init-not-called
        pmodel = multi_gpu_model(ser_model, gpus)
        # mimic copy constructor via __dict__ update, hence no super-init
        self.__dict__.update(pmodel.__dict__)
        self._smodel = ser_model

    def __getattribute__(self, attrname):
        '''Override load and save methods to be used from the serial-model.
        The serial-model holds references to the weights in the multi-GPU
        model.
        '''
        # Delegate any load/save attribute lookups to the serial model so
        # that checkpoints can be reloaded on a single GPU.
        if 'load' in attrname or 'save' in attrname:
            return getattr(self._smodel, attrname)
        return super(ModelKerasMGPU, self).__getattribute__(attrname)
```
It's interesting that with this class wrapper I do not need the super init, probably because of the __dict__ update; a minimal illustration of that trick follows.
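This standalone example (hypothetical, not from the repo) shows why the __dict__ copy removes the need to call the base-class initializer:

```python
# The wrapper adopts the fully initialized state of an inner instance,
# so calling the base-class __init__ again is unnecessary.
class Inner(object):
    def __init__(self):
        self.value = 42

class Wrapper(Inner):
    def __init__(self):  # no super().__init__() call
        inner = Inner()
        self.__dict__.update(inner.__dict__)  # adopt inner's state

print(Wrapper().value)  # prints 42
```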
The latest multi_gpu_model seems to perform better than my version. The gpus parameter is an integer or a list of integers for GPU ids. You might want to test it. The ModelKerasMGPU class wrapper is there to enable model checkpointing, which doesn't work with the model returned by the multi_gpu_model function; a usage sketch follows below.
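A hedged usage sketch of the wrapper with checkpointing; the toy model, data, and file name are illustrative assumptions, and ModelKerasMGPU is the class defined above:

```python
import numpy as np
from keras.callbacks import ModelCheckpoint
from keras.layers import Dense, Input
from keras.models import Model

inp = Input(shape=(32,))
out = Dense(1)(inp)
serial_model = Model(inp, out)

# Requires at least 2 visible GPUs; gpus may also be a list of GPU ids.
model = ModelKerasMGPU(serial_model, gpus=2)
model.compile(optimizer='adam', loss='mse')

x = np.random.rand(256, 32).astype('float32')
y = np.random.rand(256, 1).astype('float32')

# ModelCheckpoint calls model.save(); the wrapper's __getattribute__
# redirects that to the serial model, so the saved file reloads cleanly.
ckpt = ModelCheckpoint('mgpu_checkpoint.h5')
model.fit(x, y, batch_size=64, epochs=2, callbacks=[ckpt])
```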
Good to know. Thank you!
Ran into this issue while using keras_experiments as a multi-GPU benchmark with a newer version of Keras (2.2.0). Adding this line seems to solve the issue.