laekov / fastmoe

A fast MoE impl for PyTorch
https://fastmoe.ai
Apache License 2.0
1.52k stars 184 forks

Update readme-cn.md #203

Closed HelloWorldLTY closed 5 months ago

HelloWorldLTY commented 5 months ago

Hi, I think there is a typo in the Chinese document. According to the English document, the distributed expert feature is enabled by default:

The distributed expert feature is enabled by default. If you want to disable it, pass environment variable USE_NCCL=0 to the setup script.
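For context, "passing the environment variable to the setup script" would look something like the following. This is a minimal sketch assuming the standard `python setup.py install` invocation; the exact script name and install command may differ for fastmoe.

```shell
# Disable the distributed expert feature at build time
# (assumed setup.py invocation; see fastmoe's install docs for the exact command)
USE_NCCL=0 python setup.py install

# Default build (USE_NCCL unset): distributed expert feature is enabled
python setup.py install
```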

But the Chinese document describes the default as "disabled". I think this is a typo. Thanks.