CAMMA-public / rendezvous

A transformer-inspired neural network for surgical action triplet recognition from laparoscopic videos.

Code Error - Cannot Run Successfully #5

Closed lihaoliu-cambridge closed 2 years ago

lihaoliu-cambridge commented 2 years ago

Thanks for sharing your code.

However, this repo is problematic. I followed your instructions to run it, but the code cannot be run directly; it fails with the following error.

Traceback (most recent call last):
  File "run.py", line 417, in <module>
    header3 = "** LR Config: Init: {} | Peak: {} | Warmup Epoch: {} | Rise: {} | Decay {} | train params {} | all params {} **".format([float(f'{sch.get_last_lr()[0]:.6f}') for sch in lr_schedulers], [float(f'{v:.6f}') for v in wp_lr], warmups, power, decay_rate, pytorch_train_params, pytorch_total_params)
  File "run.py", line 417, in <listcomp>
    header3 = "** LR Config: Init: {} | Peak: {} | Warmup Epoch: {} | Rise: {} | Decay {} | train params {} | all params {} **".format([float(f'{sch.get_last_lr()[0]:.6f}') for sch in lr_schedulers], [float(f'{v:.6f}') for v in wp_lr], warmups, power, decay_rate, pytorch_train_params, pytorch_total_params)
  File "/XXX/python3.7/site-packages/torch/optim/lr_scheduler.py", line 99, in get_last_lr
    return self._last_lr
AttributeError: 'SequentialLR' object has no attribute '_last_lr'

When I fix that by following the instructions in https://github.com/CAMMA-public/rendezvous/issues/2#issuecomment-1120167221, another print error appears.

Traceback (most recent call last):
  File "run.py", line 434, in <module>
    print("Traning | lr: {} | epoch {}".format([lr.get_last_lr() for lr in lr_schedulers], epoch), end=" | ", file=open(logfile, 'a+'))
  File "run.py", line 434, in <listcomp>
    print("Traning | lr: {} | epoch {}".format([lr.get_last_lr() for lr in lr_schedulers], epoch), end=" | ", file=open(logfile, 'a+'))
  File "XXX/python3.7/site-packages/torch/optim/lr_scheduler.py", line 99, in get_last_lr
    return self._last_lr
AttributeError: 'SequentialLR' object has no attribute '_last_lr'

nwoyecid commented 2 years ago

I am not able to provide a detailed debug at the moment because I am busy with another project, but a quick look at your error shows that it has to do with your torch installation. It is coming from the optimizer's learning rate scheduler. Install a higher version of torch and the error will be gone.

If you can't upgrade your torch, simply remove the learning rates from all the print statements and your code will run without error.
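
For reference, a minimal sketch of that workaround, assuming `lr_schedulers` and `logfile` are the objects used in run.py around line 434; the helper name `safe_last_lr` is hypothetical, not from the repo. It falls back to reading the learning rate directly from the wrapped optimizer when `get_last_lr()` raises the `AttributeError` on older torch builds:

```python
# Sketch of a workaround, assuming an older torch where SequentialLR
# does not populate `_last_lr` before the first .step() call.
def safe_last_lr(scheduler):
    """Return the scheduler's last LR if available, otherwise read the
    current LR directly from the wrapped optimizer's param groups."""
    try:
        return scheduler.get_last_lr()
    except AttributeError:
        # Every torch LR scheduler keeps a reference to its optimizer.
        return [group['lr'] for group in scheduler.optimizer.param_groups]

# Hypothetical replacement for the failing print at run.py line 434:
# print("Traning | lr: {} | epoch {}".format(
#     [safe_last_lr(lr) for lr in lr_schedulers], epoch),
#     end=" | ", file=open(logfile, 'a+'))
```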
