Closed: Chriskuei closed this pull request 5 years ago.
Merging #729 into 2.2-dev will decrease coverage by 0.15%. The diff coverage is 92.62%.
```diff
@@            Coverage Diff            @@
##           2.2-dev     #729      +/-  ##
==========================================
- Coverage    94.09%   93.93%    -0.16%
==========================================
  Files           91       99        +8
  Lines         3115     3481      +366
==========================================
+ Hits          2931     3270      +339
- Misses         184      211       +27
```
| Impacted Files | Coverage Δ |
|---|---|
| matchzoo/optimizers/multi_optimizer.py | 87.5% <87.5%> (ø) |
| matchzoo/optimizers/multi_adam.py | 88.13% <88.13%> (ø) |
| matchzoo/optimizers/multi_sgd.py | 91.11% <91.11%> (ø) |
| matchzoo/optimizers/multi_adagrad.py | 93.18% <93.18%> (ø) |
| matchzoo/optimizers/multi_rmsprop.py | 93.18% <93.18%> (ø) |
| matchzoo/optimizers/multi_adadelta.py | 93.87% <93.87%> (ø) |
| matchzoo/optimizers/multi_adamax.py | 94.23% <94.23%> (ø) |
| matchzoo/optimizers/multi_nadam.py | 96.49% <96.49%> (ø) |
| ... and 6 more | |
Continue to review the full report at Codecov.

> Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
>
> Powered by Codecov. Last update a7c07ff...abae72f. Read the comment docs.
This should be a PR to Keras rather than MatchZoo.
This PR implements optimizers with learning rate multipliers, which can be used to apply different learning rates to different layers. For example:
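A minimal sketch of the intended usage, assuming a `MultiAdam` class (the name is inferred from `matchzoo/optimizers/multi_adam.py` in the file list above) that accepts a `multipliers` dict; the exact signature is an assumption, not confirmed by this thread. Only the `multipliers` argument and the 0.8 value for `dense_1` come from the description below.

```python
# Sketch only: `MultiAdam` and its signature are assumptions inferred from
# matchzoo/optimizers/multi_adam.py; the `multipliers` semantics follow the
# PR description.
from keras import layers, models

from matchzoo.optimizers import MultiAdam  # assumed import path

model = models.Sequential([
    layers.Dense(64, activation='relu', name='dense_1', input_shape=(32,)),
    layers.Dense(1, name='dense_out'),
])

# Layers whose names contain 'dense_1' are updated with 0.8x the global
# learning rate; all other layers use the global rate unchanged.
optimizer = MultiAdam(lr=0.001, multipliers={'dense_1': 0.8})
model.compile(optimizer=optimizer, loss='mse')
```

Under this sketch, with a global learning rate of 0.001, the weights of `dense_1` would be updated with an effective rate of 0.001 × 0.8 = 0.0008.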
In this example, the `multipliers` parameter indicates that layers whose names contain `dense_1` have a learning rate multiplier of 0.8 with respect to the global learning rate.

Related issue: #728