seansica opened this issue 3 months ago
```python
>>> import torch.optim.optimizer as optim
>>> optim.__dir__()
['__name__',
 '__doc__',
 '__package__',
 '__loader__',
 '__spec__',
 '__file__',
 '__cached__',
 '__builtins__',
 '__annotations__',
 'math',
 'functools',
 'warnings',
 'OrderedDict',
 'defaultdict',
 'deepcopy',
 'chain',
 'Any',
 'Callable',
 'DefaultDict',
 'Dict',
 'Hashable',
 'Iterable',
 'List',
 'Optional',
 'Set',
 'Tuple',
 'TypeVar',
 'Union',
 'cast',
 'overload',
 'ParamSpec',
 'Self',
 'TypeAlias',
 'torch',
 'hooks',
 'RemovableHandle',
 'Indices',
 'TensorListList',
 '_get_foreach_kernels_supported_devices',
 '_get_fused_kernels_supported_devices',
 'is_compiling',
 '_group_tensors_by_device_and_dtype',
 'Args',
 'Kwargs',
 'StateDict',
 'GlobalOptimizerPreHook',
 'GlobalOptimizerPostHook',
 '__all__',
 '_global_optimizer_pre_hooks',
 '_global_optimizer_post_hooks',
 '_foreach_supported_types',
 '_RequiredParameter',
 'required',
 '_use_grad_for_differentiable',
 '_get_value',
 '_stack_if_compiling',
 '_dispatch_sqrt',
 '_default_to_fused_or_foreach',
 '_view_as_real',
 '_get_scalar_dtype',
 '_foreach_doc',
 '_fused_doc',
 '_capturable_doc',
 '_differentiable_doc',
 '_maximize_doc',
 'register_optimizer_step_pre_hook',
 'register_optimizer_step_post_hook',
 'ParamsT',
 '_P',
 'R',
 'T',
 'Optimizer']
```
Could be that `params_t` needs to be changed to `ParamsT` — the module exposes `ParamsT` in the listing above, not `params_t`.
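If the rename hypothesis is right, downstream code can tolerate both spellings instead of pinning one. A minimal sketch of the fallback pattern — the helper name and the stub namespaces are hypothetical, standing in for `torch.optim.optimizer` on different PyTorch versions:

```python
import types

def resolve_params_alias(optim_mod):
    """Return the params type alias from an optimizer module,
    trying the newer `ParamsT` name first and falling back to
    the older `params_t` spelling (hypothetical helper)."""
    try:
        return optim_mod.ParamsT
    except AttributeError:
        return optim_mod.params_t

# Stub namespaces standing in for torch.optim.optimizer, since which
# attribute exists depends on the installed PyTorch version.
new_style = types.SimpleNamespace(ParamsT="new-alias")
old_style = types.SimpleNamespace(params_t="old-alias")

print(resolve_params_alias(new_style))  # picks ParamsT when present
print(resolve_params_alias(old_style))  # falls back to params_t
```

Applied to the real module, the same pattern is `try: from torch.optim.optimizer import ParamsT / except ImportError: from torch.optim.optimizer import params_t as ParamsT`.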
Also just noted that the issue does not occur when building from source (`poetry install`). I think that suggests there is a discrepancy with the pip install — maybe an incorrectly specified dependency in `[tool.poetry.dependencies]`, or something like that.
Ran the demo code and it throws the same error. This occurs in Colab.