google-research / tapas

End-to-end neural table-text understanding models.
Apache License 2.0

Gradient Accumulation #110

Closed kamalkraj closed 3 years ago

kamalkraj commented 3 years ago

Why is there a negative sign in

if gradient_accumulation_steps > 1:
    optimizer = GradientAccumulationOptimizer(
        optimizer,
        steps=-gradient_accumulation_steps,
        grad_clipping=grad_clipping)

optimization_test.py passes even without the negative sign.
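For context, the mechanism being configured above is standard gradient accumulation: gradients from several micro-batches are summed and applied as one update, emulating a larger effective batch size. Below is a minimal, dependency-free sketch of that idea (the function name `accumulate_and_apply` and all parameters are hypothetical illustrations, not the TAPAS GradientAccumulationOptimizer API, and it does not explain the sign convention asked about here):

```python
# Hypothetical sketch of gradient accumulation, not the TAPAS implementation:
# sum gradients over `steps` micro-batches, then apply their mean once.

def accumulate_and_apply(param, grads, steps, lr=0.1):
    """Apply the mean of every `steps` consecutive gradients to `param`."""
    accum = 0.0
    for i, g in enumerate(grads, start=1):
        accum += g                        # accumulate micro-batch gradient
        if i % steps == 0:                # every `steps` micro-batches...
            param -= lr * (accum / steps) # ...apply one averaged update
            accum = 0.0
    return param

# With steps=2, each update uses the average gradient of a pair of
# micro-batches, matching a single update on the combined batch.
print(accumulate_and_apply(1.0, [0.5, 1.5, 1.0, 1.0], steps=2))
```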