mgrankin/over9000: Over9000 optimizer
Apache License 2.0 · 425 stars · 57 forks
Issues
#25 Please update README with latest optimizers like MADGRAD and AdaBelief (shkarupa-alex, closed 3 years ago, 2 comments)
#24 Performance mismatch between notebook and readme (juntang-zhuang, closed 3 years ago, 2 comments)
#23 If I use RangerLars, should I use any additional lr_scheduler? (tugot17, closed 4 years ago, 1 comment)
#22 Recommendation on optimal LR (IamGianluca, closed 4 years ago, 3 comments)
#21 RangerLars (sky186, closed 4 years ago, 2 comments)
#20 Update benchmarks now that Ranger supports gradient centralization (LifeIsStrange, closed 4 years ago, 1 comment)
#19 Port to fastai v2? (oguiza, closed 4 years ago, 6 comments)
#18 How about RAdam + LAMB + Lookahead? (Whu-wxy, closed 4 years ago, 2 comments)
#17 Create LICENSE (mgrankin, closed 4 years ago, 0 comments)
#16 What do you think about weight decay? (hadaev8, closed 4 years ago, 1 comment)
#15 Citing the repository (kakumarabhishek, closed 4 years ago, 2 comments)
#14 What is the meaning of Imagenette and Imagewoof? (fyting, closed 5 years ago, 2 comments)
#13 LR Scheduler (kakumarabhishek, closed 5 years ago, 2 comments)
#12 Question about LAMB implementation (fastalgo, closed 4 years ago, 7 comments)
#11 Merge them all? (LifeIsStrange, closed 5 years ago, 1 comment)
#10 Applying fix for different param_groups for RAdam (redknightlois, closed 5 years ago, 2 comments)
#9 Fix bug in RAdam's buffer implementation (sholderbach, closed 5 years ago, 2 comments)
#8 Adding state_dict to LookAhead optimizer (LeviViana, closed 5 years ago, 2 comments)
#7 Proper implementation of Ralamb (redknightlois, closed 5 years ago, 0 comments)
#6 Lookahead has no attribute 'state' (staylor-ds, closed 5 years ago, 2 comments)
#5 Testing that uses LAMB only (Tony-Y, closed 5 years ago, 2 comments)
#4 AdaBound (r1ckya, closed 4 years ago, 6 comments)
#3 Failure to execute on Windows (redknightlois, closed 4 years ago, 12 comments)
#2 Updated Ralamb Source (redknightlois, closed 5 years ago, 2 comments)
#1 How to use over9000 in common pytorch code? (askerlee, closed 5 years ago, 10 comments)