hazdzz / STGCN

The PyTorch implementation of STGCN.
GNU Lesser General Public License v2.1

confusion about scheduler.step() #18

Closed 1940653868 closed 1 year ago

1940653868 commented 1 year ago

It seems that the position of scheduler.step() in main.py is wrong, which causes the learning rate to decrease too fast. Here is the code:

for x, y in tqdm.tqdm(train_iter):
    ...
    optimizer.step()
    scheduler.step()
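
To make the effect concrete, here is a minimal, self-contained sketch comparing the learning rate you end up with when scheduler.step() is called per batch versus per epoch. The batch count of 50, the 10 epochs, and the 1e-3 base learning rate are illustrative numbers, not values taken from main.py; only step_size=10 and gamma=0.95 match the setup discussed here.

import torch

def lr_after(epochs, batches_per_epoch, step_per_batch):
    # A dummy parameter and optimizer, just enough to drive the scheduler.
    param = torch.nn.Parameter(torch.zeros(1))
    optimizer = torch.optim.Adam([param], lr=1e-3)
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.95)
    for _ in range(epochs):
        for _ in range(batches_per_epoch):
            optimizer.step()
            if step_per_batch:
                scheduler.step()  # decay fires every 10 *batches*
        if not step_per_batch:
            scheduler.step()      # decay fires every 10 *epochs*
    return optimizer.param_groups[0]["lr"]

print(lr_after(10, 50, step_per_batch=True))   # about 1e-3 * 0.95**50 ~= 7.7e-5
print(lr_after(10, 50, step_per_batch=False))  # 1e-3 * 0.95**1 = 9.5e-4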
hazdzz commented 1 year ago

So, do you have any proposal to fix this issue?

haewonc commented 1 year ago
for epoch in range(args.epochs):
    for x, y in tqdm.tqdm(train_iter):
        ...
        optimizer.step()
    scheduler.step()

I believe the scheduler.step() should be outside the mini-batch loop, since you set step_size = 10. Alternatively, step_size could be set to a multiple of len(train_iter). Also, since gamma = 0.95 and 0.95^100 ≈ 0.0059, the number of epochs or the step_size might need to be adjusted, e.g. step_size = 10 with epochs = 1000, or step_size = 100 with epochs = 10000.
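
For the alternative of keeping scheduler.step() inside the batch loop, a rough sketch could look like the following; the toy model, loss, data loader, and epoch count are placeholders so the snippet runs on its own and are not the ones used in main.py:

import torch
import tqdm

# Toy stand-ins; in main.py these come from the STGCN model,
# the real data loader, and the parsed arguments.
model = torch.nn.Linear(4, 1)
loss_fn = torch.nn.MSELoss()
train_iter = [(torch.randn(8, 4), torch.randn(8, 1)) for _ in range(50)]
epochs = 100

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# step_size is a multiple of the batch count, so the 0.95 decay still
# fires only once every 10 epochs even though step() runs every batch.
scheduler = torch.optim.lr_scheduler.StepLR(
    optimizer, step_size=10 * len(train_iter), gamma=0.95)

for epoch in range(epochs):
    for x, y in tqdm.tqdm(train_iter):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        scheduler.step()

With step_size scaled this way, the per-batch calls produce effectively the same decay schedule as calling scheduler.step() once per epoch with step_size = 10.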

1940653868 commented 1 year ago

Thanks for your reply. I think that's right. 
