Open locta66 opened 3 years ago
omegaconf.errors.ConfigAttributeError: Key 'min_lr' is not in struct full_key: optimization.min_lr reference_type=Any object_type=dict
with fairseq 1.0.0 hydra-core 1.0.6
How to reproduce: torch 1.8.0, torchvision 0.81, fairseq 1.0.0 (installed from the https://github.com/pytorch/fairseq master branch; 1.0.0 is not reachable via pip), hydra-core 1.0.6.
Then I run the preprocess and pretrain scripts; the pretrain script produces the error.
Thanks for the help. I think the bug can be fixed by renaming optimization.min_lr to optimization.stop_min_lr. There is a PR in fairseq, FYI: https://github.com/pytorch/fairseq/issues/1486