overwindows / PALM

PALM: Pre-training an Autoencoding & Autoregressive Language Model for Context-conditioned Generation
https://arxiv.org/abs/2004.07159

Config Error #2

Status: Open · locta66 opened this issue 3 years ago

locta66 commented 3 years ago

```
omegaconf.errors.ConfigAttributeError: Key 'min_lr' is not in struct
    full_key: optimization.min_lr
    reference_type=Any
    object_type=dict
```

with fairseq 1.0.0 and hydra-core 1.0.6.
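For context, this error class comes from omegaconf's struct mode, which fairseq enables on its configs: reading a key the config schema does not define raises `ConfigAttributeError`. A minimal standalone sketch (not PALM's actual config, just an illustration of the mechanism):

```python
# Minimal illustration of omegaconf struct mode; requires only omegaconf.
from omegaconf import OmegaConf

cfg = OmegaConf.create({"optimization": {"max_update": 100000}})
OmegaConf.set_struct(cfg, True)  # struct mode: unknown keys become errors

try:
    cfg.optimization.min_lr  # key absent from the schema
except Exception as e:
    print(type(e).__name__, e)  # ConfigAttributeError: Key 'min_lr' is not in struct
```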

overwindows commented 3 years ago

> omegaconf.errors.ConfigAttributeError: Key 'min_lr' is not in struct full_key: optimization.min_lr reference_type=Any object_type=dict
>
> with fairseq 1.0.0 and hydra-core 1.0.6.

How can I reproduce this?

locta66 commented 3 years ago

> omegaconf.errors.ConfigAttributeError: Key 'min_lr' is not in struct full_key: optimization.min_lr reference_type=Any object_type=dict with fairseq 1.0.0 hydra-core 1.0.6
>
> How can I reproduce this?

With torch 1.8.0, torchvision 0.8.1, fairseq 1.0.0 (installed from the https://github.com/pytorch/fairseq master branch, since 1.0.0 is not reachable via pip), and hydra-core 1.0.6.

Then I run the preprocess and pretrain scripts; the pretrain script produces the error.

overwindows commented 3 years ago

> With torch 1.8.0, torchvision 0.8.1, fairseq 1.0.0 (installed from the https://github.com/pytorch/fairseq master branch, since 1.0.0 is not reachable via pip), and hydra-core 1.0.6.
>
> Then I run the preprocess and pretrain scripts; the pretrain script produces the error.

Thanks for the help. I think the bug can be fixed by renaming `optimization.min_lr` to `optimization.stop_min_lr`. There is a related PR in fairseq, FYI: https://github.com/pytorch/fairseq/issues/1486
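For reference, the fairseq master branch renamed this field in its optimization config, so a config that still carries the legacy key can be migrated before it reaches fairseq. A minimal omegaconf sketch of that rename (the surrounding keys and values are illustrative, not PALM's actual config):

```python
# Sketch of migrating the legacy key, assuming an omegaconf/hydra-style
# config dict; key names follow the fairseq rename, values are illustrative.
from omegaconf import OmegaConf

cfg = OmegaConf.create({"optimization": {"min_lr": 1e-9, "max_update": 100000}})

# Rename the legacy key to the name newer fairseq expects.
opt = OmegaConf.to_container(cfg.optimization)
opt["stop_min_lr"] = opt.pop("min_lr")
cfg.optimization = OmegaConf.create(opt)

print(cfg.optimization.stop_min_lr)  # 1e-09
```

On the command line, the corresponding flag on fairseq master is `--stop-min-lr`, replacing the old `--min-lr`.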