Mrpatekful / swats
Unofficial implementation of Switching from Adam to SGD optimization in PyTorch.
MIT License · 65 stars · 18 forks
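The repository packages the SWATS technique (switching from Adam to SGD partway through training, after Keskar & Socher's "Improving Generalization Performance by Switching from Adam to SGD") as a drop-in PyTorch optimizer. A minimal usage sketch follows; the `swats` package name, the `SWATS` class, and its constructor arguments are assumptions based on the repository name and description, not confirmed from its README.

```python
# Minimal sketch: using the optimizer in a standard PyTorch training loop.
# The import path and constructor signature below are assumptions; check the
# repository README for the exact API.
import torch
import torch.nn as nn
import swats  # assumed package name, e.g. `pip install swats`

model = nn.Linear(10, 1)
optimizer = swats.SWATS(model.parameters(), lr=1e-3)  # assumed constructor

for _ in range(100):
    inputs, targets = torch.randn(32, 10), torch.randn(32, 1)
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(inputs), targets)
    loss.backward()
    optimizer.step()  # the optimizer decides internally when to switch from Adam to SGD
```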
Issues (newest first)
How to adjust lr when optimizer changed from Adam to SGD? Thank you! (#19) · ilplmm123 · opened 1 year ago · 0 comments (see the learning-rate sketch after this list)
Bump certifi from 2019.3.9 to 2022.12.7 (#18) · dependabot[bot] · opened 1 year ago · 0 comments
Bump pillow from 6.0.0 to 9.3.0 (#17) · dependabot[bot] · opened 2 years ago · 0 comments
Bump numpy from 1.16.4 to 1.22.0 (#16) · dependabot[bot] · opened 2 years ago · 0 comments
Bump pillow from 6.0.0 to 9.0.1 (#15) · dependabot[bot] · closed 2 years ago · 1 comment
Bump pillow from 6.0.0 to 9.0.0 (#14) · dependabot[bot] · closed 2 years ago · 1 comment
How to adjust the lr with step? (#13) · Geek-lixiang · opened 3 years ago · 0 comments (see the learning-rate sketch after this list)
Bump pillow from 6.0.0 to 8.3.2 (#12) · dependabot[bot] · closed 2 years ago · 1 comment
Bump pillow from 6.0.0 to 8.2.0 (#11) · dependabot[bot] · closed 3 years ago · 1 comment
Bump urllib3 from 1.25.3 to 1.26.5 (#10) · dependabot[bot] · opened 3 years ago · 0 comments
Bump urllib3 from 1.25.3 to 1.25.8 (#9) · dependabot[bot] · closed 3 years ago · 1 comment
Bump pygments from 2.4.2 to 2.7.4 (#8) · dependabot[bot] · opened 3 years ago · 0 comments
Bump pillow from 6.0.0 to 8.1.1 (#7) · dependabot[bot] · closed 3 years ago · 1 comment
Bump bleach from 3.1.0 to 3.3.0 (#6) · dependabot[bot] · opened 3 years ago · 0 comments
Bump bleach from 3.1.0 to 3.1.4 (#5) · dependabot[bot] · closed 3 years ago · 1 comment
Bump bleach from 3.1.0 to 3.1.2 (#4) · dependabot[bot] · closed 4 years ago · 1 comment
Bump bleach from 3.1.0 to 3.1.1 (#3) · dependabot[bot] · closed 4 years ago · 1 comment
Training After Switch to SGD is Flat (#2) · noiran78 · opened 4 years ago · 4 comments
Bump pillow from 6.0.0 to 6.2.0 (#1) · dependabot[bot] · closed 3 years ago · 1 comment
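Issues #19 and #13 both ask how to adjust the learning rate when the optimizer switches from Adam to SGD. Assuming the `SWATS` class is a regular `torch.optim.Optimizer` subclass (which a drop-in PyTorch optimizer has to be), the built-in `torch.optim.lr_scheduler` schedulers can be attached to it just like to Adam or SGD. The sketch below illustrates that under those assumptions; it is not an answer confirmed by the maintainer, and the constructor arguments are illustrative.

```python
# Sketch: attaching a standard PyTorch LR scheduler to the optimizer.
# Assumes swats.SWATS subclasses torch.optim.Optimizer; names and arguments
# are assumptions, not confirmed against the repository's API.
import torch
import torch.nn as nn
import swats  # assumed package name

model = nn.Linear(10, 1)
optimizer = swats.SWATS(model.parameters(), lr=1e-3)
# Decay every param group's lr by 10x every 30 epochs, exactly as one would
# with torch.optim.Adam or torch.optim.SGD.
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(90):
    inputs, targets = torch.randn(32, 10), torch.randn(32, 1)  # stand-in for a DataLoader
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(inputs), targets)
    loss.backward()
    optimizer.step()
    scheduler.step()  # per-epoch lr adjustment, applied both before and after the Adam-to-SGD switch
```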