I want to change the compression method during training. In this case, how should I use GRACE? The pseudo-code is shown below:
compression = A()
optimizer = hvd.DistributedOptimizer(optimizer,
                                     compression,
                                     named_parameters=model.named_parameters())
for epoch in range(0, 100):
    if epoch > 50:
        compression = B()
        # How do I apply this new compression to the DistributedOptimizer?
    train()
    test()
    ...
Could anyone help me solve this problem? I would appreciate your help!
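For reference, one possible approach (not verified against the GRACE/Horovod internals) is to re-wrap the base optimizer with a new `hvd.DistributedOptimizer` when the compression changes, since the wrapper captures the compression object at construction time. The sketch below uses stand-in classes in place of `hvd.DistributedOptimizer` and the real GRACE compressors `A`/`B`, purely to illustrate the re-wrapping pattern; the `_compression` attribute name and the stand-in class shapes are assumptions, not the actual Horovod API.

```python
# Stand-ins for the GRACE compressors A and B (hypothetical, for illustration).
class CompressionA:
    pass

class CompressionB:
    pass

class DistributedOptimizer:
    """Toy stand-in for hvd.DistributedOptimizer; only models that the
    wrapper captures the compression object when it is constructed."""
    def __init__(self, optimizer, compression):
        self.optimizer = optimizer
        self._compression = compression

def wrap(base_optimizer, compression):
    # Re-create the wrapper whenever the compression method changes.
    return DistributedOptimizer(base_optimizer, compression)

base = object()  # stand-in for the underlying (e.g. torch) optimizer
optimizer = wrap(base, CompressionA())

for epoch in range(100):
    if epoch == 51:
        # Switch once at the boundary instead of re-wrapping every epoch.
        optimizer = wrap(base, CompressionB())
    # train() / test() would go here
```

Note that re-wrapping discards any state the wrapper holds (e.g. registered hooks or compression residual memory), so whether this is safe mid-training depends on the compressor being used.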