XiangLi1999 / PrefixTuning

Prefix-Tuning: Optimizing Continuous Prompts for Generation

PyTorch Lightning Version? #19

Closed ekoenitz closed 2 years ago

ekoenitz commented 2 years ago

What version of PyTorch Lightning was this built with? I followed the setup instructions to install the requirements, but I keep getting errors from renamed parameters in the seq2seq module (the gpt-2 module works fine). I can fix the errors as they come up by consulting the current PyTorch Lightning documentation (`filepath` in the trace should be `dirpath`, for example), but I'd rather use the code as written instead of manually updating it.

```
Traceback (most recent call last):
  File "finetune.py", line 876, in <module>
    main(args)
  File "finetune.py", line 782, in main
    checkpoint_callback=get_checkpoint_callback(args.output_dir, model.val_metric, args.save_top_k, lower_is_better), #LISA
  File "/workspace/PrefixTuning/seq2seq/callbacks.py", line 105, in get_checkpoint_callback
    period=0,  # maybe save a checkpoint every time val is run, not just end of epoch.
TypeError: __init__() got an unexpected keyword argument 'filepath'
```
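For context, the breaking change the traceback points at is the `ModelCheckpoint` callback API: pytorch-lightning 0.9.x (which this repo appears to target) took a single `filepath` argument and a `period` option, while 1.x splits `filepath` into `dirpath` and `filename`. A minimal sketch of the two call styles, using stub functions in place of the real `pytorch_lightning.callbacks.ModelCheckpoint` so it runs without the library installed (argument names are the point; the stubs and values are illustrative):

```python
# Illustration of the ModelCheckpoint argument rename between
# pytorch-lightning 0.9.x and 1.x. Stubs stand in for the real class.

def checkpoint_0_9(filepath=None, monitor=None, save_top_k=1, period=1):
    """0.9.x-style signature: single `filepath` template, `period` option."""
    return {"filepath": filepath, "monitor": monitor}

def checkpoint_1_x(dirpath=None, filename=None, monitor=None, save_top_k=1):
    """1.x-style signature: `filepath` split into `dirpath` + `filename`."""
    return {"dirpath": dirpath, "filename": filename, "monitor": monitor}

# callbacks.py uses the 0.9.x form; passing `filepath` to the 1.x class
# raises: TypeError: __init__() got an unexpected keyword argument 'filepath'
old = checkpoint_0_9(filepath="output/{epoch}-{val_loss:.2f}", monitor="val_loss")
new = checkpoint_1_x(dirpath="output", filename="{epoch}-{val_loss:.2f}",
                     monitor="val_loss")
```

So either the library must be pinned to the old version or the call site updated to the new argument names.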

XiangLi1999 commented 2 years ago

Try `pip install pytorch-lightning==0.9.0`.

ekoenitz commented 2 years ago

Thanks for the prompt response!