facebookresearch / optimizers

For optimization algorithm research and development.

ModuleNotFoundError: No module named 'optimizer_modules', while trying to import the DistributedShampoo class #15

Closed: ExpressGradient closed this issue 2 months ago

ExpressGradient commented 3 months ago

I'm facing a ModuleNotFoundError while trying to import the DistributedShampoo class from distributed_shampoo.distributed_shampoo.

Here's the entire trace:

{
    "name": "ModuleNotFoundError",
    "message": "No module named 'optimizer_modules'",
    "stack": "---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
Cell In[17], line 7
      3 from torch.utils.data import Dataset, DataLoader
      5 # import tiktoken
----> 7 from distributed_shampoo.distributed_shampoo import DistributedShampoo

File ~\\softwares\\personal\\lingo\\deps\\optimizers\\distributed_shampoo\\distributed_shampoo.py:73
     26 import torch
     28 from distributed_shampoo.shampoo_types import (
     29     AdaGradGraftingConfig,
     30     AdamGraftingConfig,
   (...)
     70     WEIGHT_DECAY,
     71 )
---> 73 from distributed_shampoo.utils.shampoo_checkpoint_utils import (
     74     extract_state_dict_content,
     75     flatten,
     76     unflatten,
     77     update_param_state_dict_object,
     78 )
     79 from distributed_shampoo.utils.shampoo_ddp_distributor import DDPDistributor
     80 from distributed_shampoo.utils.shampoo_distributor import Distributor

File ~\\softwares\\personal\\lingo\\deps\\optimizers\\distributed_shampoo\\utils\\shampoo_checkpoint_utils.py:18
     15 from typing import Any, Dict, List, Union
     17 import torch
---> 18 from optimizer_modules import OptimizerModule
     21 logger: logging.Logger = logging.getLogger(__name__)
     24 def flatten(input_dict: Dict[str, Any]) -> Dict[str, Any]:

ModuleNotFoundError: No module named 'optimizer_modules'"
}
hjmshi commented 3 months ago

Hi @ExpressGradient, thanks for your interest in our code. Did you also copy the optimizer_modules.py file into your repo? It looks like that module cannot be found. Please note that both optimizer_modules.py and matrix_functions.py are necessary in order to run the PyTorch Distributed Shampoo code.
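[Editor's note: the trace shows that shampoo_checkpoint_utils.py does a top-level `from optimizer_modules import OptimizerModule`, so the repo root itself (not just the distributed_shampoo package) must be on sys.path. A minimal workaround sketch, assuming the optimizers repository was vendored at the local path from the trace above (adjust the path for your setup) and that optimizer_modules.py and matrix_functions.py live at the repo root:]

```python
import sys
from pathlib import Path

# Path taken from the traceback above -- adjust to wherever the repo was cloned.
# The repo root must be on sys.path so that the top-level
# `from optimizer_modules import OptimizerModule` inside the package resolves.
REPO_ROOT = Path.home() / "softwares" / "personal" / "lingo" / "deps" / "optimizers"
sys.path.insert(0, str(REPO_ROOT))

# This import now succeeds, since optimizer_modules.py is importable.
from distributed_shampoo.distributed_shampoo import DistributedShampoo
```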

ExpressGradient commented 2 months ago

Hi, sorry for the late reply. It works now after I did a fresh install. Thanks!
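[Editor's note: as a quick sanity check after a fresh install, one can verify that all three required modules resolve before importing the optimizer. This is a hedged suggestion, not part of the original thread:]

```python
import importlib.util

# Each of these must be importable for DistributedShampoo to load.
for mod in ("optimizer_modules", "matrix_functions", "distributed_shampoo"):
    found = importlib.util.find_spec(mod) is not None
    print(f"{mod}: {'found' if found else 'MISSING'}")
```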