The repository provides code for running inference with the Meta Segment Anything Model 2 (SAM 2), links for downloading the trained model checkpoints, and example notebooks that show how to use the model.
Apache License 2.0
Error in call to target 'torch.optim.adamw.AdamW' #420
Hello, I am facing this error even though I have defined the optimizer in the config file like this:
```yaml
optim:
  amp:
    enabled: true
    amp_dtype: bfloat16
  optimizer:
    _target_: torch.optim.AdamW
```
```
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/hydra/_internal/instantiate/_instantiate2.py", line 92, in _call_target
    return _target_(*args, **kwargs)
TypeError: AdamW.__init__() missing 1 required positional argument: 'params'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/torch/multiprocessing/spawn.py", line 90, in _wrap
    fn(i, *args)
  File "/netscratch/naseem/sam2/training/train.py", line 40, in single_proc_run
    trainer = instantiate(cfg.trainer, _recursive_=False)
  File "/usr/local/lib/python3.10/dist-packages/hydra/_internal/instantiate/_instantiate2.py", line 226, in instantiate
    return instantiate_node(
  File "/usr/local/lib/python3.10/dist-packages/hydra/_internal/instantiate/_instantiate2.py", line 366, in instantiate_node
    cfg[key] = instantiate_node(
  File "/usr/local/lib/python3.10/dist-packages/hydra/_internal/instantiate/_instantiate2.py", line 366, in instantiate_node
    cfg[key] = instantiate_node(
  File "/usr/local/lib/python3.10/dist-packages/hydra/_internal/instantiate/_instantiate2.py", line 347, in instantiate_node
    return _call_target(_target_, partial, args, kwargs, full_key)
  File "/usr/local/lib/python3.10/dist-packages/hydra/_internal/instantiate/_instantiate2.py", line 97, in _call_target
    raise InstantiationException(msg) from e
hydra.errors.InstantiationException: Error in call to target 'torch.optim.adamw.AdamW':
TypeError("AdamW.__init__() missing 1 required positional argument: 'params'")
full_key: trainer.optim.optimizer
```
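The root `TypeError` means Hydra ended up calling `torch.optim.AdamW` with only keyword options and no `params` argument, which the optimizer requires positionally. A minimal pure-Python sketch of the mechanism, where `FakeAdamW` and `instantiate` are hypothetical stand-ins for `torch.optim.AdamW` and Hydra's instantiation (not SAM 2's actual training code): deferring instantiation, as Hydra does with `_partial_: true`, lets the caller supply `params` later.

```python
import functools

class FakeAdamW:
    """Stand-in for torch.optim.AdamW: `params` is a required
    positional argument, so constructing it from config options
    alone raises the TypeError seen in the traceback."""
    def __init__(self, params, lr=1e-3):
        self.params = list(params)
        self.lr = lr

def instantiate(target, partial=False, **kwargs):
    # Rough sketch of Hydra's behavior: with _partial_: true it
    # returns a factory instead of calling the target immediately.
    if partial:
        return functools.partial(target, **kwargs)
    return target(**kwargs)

try:
    instantiate(FakeAdamW, lr=1e-4)   # fails: no `params` supplied
except TypeError as e:
    print(e)

factory = instantiate(FakeAdamW, partial=True, lr=1e-4)
optimizer = factory([0.1, 0.2])       # `params` bound at call time
```

In SAM 2's training setup the optimizer config is not meant to be instantiated directly; the trainer binds the model parameters itself, so the config only has to name the target class correctly.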