laekov / fastmoe

A fast MoE impl for PyTorch
https://fastmoe.ai
Apache License 2.0

ImportError: cannot import name 'get_args' from 'megatron' #181

Open peter-fei opened 11 months ago

peter-fei commented 11 months ago

There is no function named get_args() in megatron. The functions in the __init__ don't include get_args:

r""" A set of modules to plugin into Megatron-LM with FastMoE """ from .utils import add_fmoe_args

from .layers import MegatronMLP from .layers import fmoefy

from .checkpoint import save_checkpoint from .checkpoint import load_checkpoint

from .distributed import DistributedDataParallel

from .balance import reset_gate_hook from .balance import get_balance_profile from .balance import generate_megatron_gate_hook from .balance import add_balance_log

from .patch import patch_forward_step from .patch import patch_model_provider
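
For context, the traceback's ImportError means a top-level import along the following lines is executed somewhere in the plugin; it only succeeds on Megatron-LM versions whose package root re-exports get_args (a minimal sketch of the failing pattern, not a verbatim excerpt):

```python
# Minimal sketch of the import pattern behind the reported error, not a
# verbatim excerpt from fastmoe. It only works when the megatron package
# re-exports get_args at its top level, which older releases do not.
from megatron import get_args

args = get_args()  # global argument namespace that Megatron populates at startup
```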

laekov commented 11 months ago

What is the version of your Megatron-LM? They have been changing the API significantly in their recent versions.

peter-fei commented 11 months ago

My version is 1.1.0.

laekov commented 11 months ago

1.1.0 may be too old. We have verified support for 2.2, 2.5, and 3.0.2. If you have to use 1.1.0, you need to modify either megatron or fastmoe.
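
To illustrate the "modify megatron or fastmoe" route, a guarded import like the one below is one possible shim. This is only a sketch: `megatron.global_vars` is where 2.x versions define `get_args`, and a 1.1.0 checkout may keep the equivalent function elsewhere (or not have one at all), so the fallback path would need to be adapted.

```python
# A possible compatibility shim, sketched here rather than taken from either
# codebase. The fallback module path below is where Megatron-LM 2.x defines
# get_args; a 1.1.0 tree may need a different path entirely.
try:
    from megatron import get_args  # works on versions that re-export it
except ImportError:
    from megatron.global_vars import get_args  # module path used by 2.x
```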

peter-fei commented 11 months ago

Thanks. By the way, how can I get a higher version?

laekov commented 11 months ago

Check out the tags here
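
For example, assuming the link points at the Megatron-LM repository's tags, getting one of the verified versions amounts to cloning the repository and checking out the matching tag, e.g. `git checkout v2.5` (the exact tag names depend on the repository's tagging scheme).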