Open dhamaraiselvi opened 7 months ago
Unable to resolve it. Please suggest a solution.
I'm hitting the same problem. Has it been solved? Is there no official solution?
Same problem on branch 22.04-dev.
root@32972dbe780d:~/apex# pip install -v --disable-pip-version-check --no-cache-dir --no-build-isolation --config-settings "--build-option=--cpp_ext" --config-settings "--build-option=--cuda_ext" ./
Using pip 24.1 from /usr/local/lib/python3.10/dist-packages/pip (python 3.10)
...
adding 'apex/mlp/__init__.py'
adding 'apex/mlp/mlp.py'
adding 'apex/multi_tensor_apply/__init__.py'
adding 'apex/multi_tensor_apply/multi_tensor_apply.py'
adding 'apex/normalization/__init__.py'
adding 'apex/normalization/fused_layer_norm.py'
**no fused_layer_norm_cuda here**
adding 'apex/normalization/instance_norm.py'
adding 'apex/optimizers/__init__.py'
adding 'apex/optimizers/fused_adagrad.py'
adding 'apex/optimizers/fused_adam.py'
adding 'apex/optimizers/fused_lamb.py'
adding 'apex/optimizers/fused_mixed_precision_lamb.py'
adding 'apex/optimizers/fused_novograd.py'
adding 'apex/optimizers/fused_sgd.py'
...
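The wheel listing above never shows a `fused_layer_norm_cuda*.so`, which suggests the C++/CUDA extensions were silently skipped and the build fell back to a Python-only install. A small sketch to confirm this from the filesystem (the module names checked are the ones apex builds with `--cpp_ext`/`--cuda_ext`; the site-packages layout is an assumption, not taken from the log):

```python
# Sketch: scan site-packages for apex's compiled extension modules.
# The exact .so suffix (e.g. .cpython-310-x86_64-linux-gnu.so) varies
# by interpreter/platform, so we glob on the module-name prefix.
import pathlib
import sysconfig

def find_compiled_exts(prefix: str) -> list[str]:
    """Return names of matching compiled .so files in site-packages."""
    site = pathlib.Path(sysconfig.get_paths()["purelib"])
    return sorted(p.name for p in site.glob(f"{prefix}*.so"))

for name in ("fused_layer_norm_cuda", "amp_C", "apex_C"):
    hits = find_compiled_exts(name)
    print(name, "->", hits if hits else "not built")
```

If all three come back "not built", the `--config-settings "--build-option=..."` flags were not honored by your pip/setuptools combination and the extensions were never compiled.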
File "/home/dhamaraiselvi/.local/lib/python3.10/site-packages/fairseq/models/__init__.py", line 106, in build_model
return model.build_model(cfg, task)
File "/home/dhamaraiselvi/.local/lib/python3.10/site-packages/fairseq/models/transformer/transformer_legacy.py", line 133, in build_model
return super().build_model(cfg, task)
File "/home/dhamaraiselvi/.local/lib/python3.10/site-packages/fairseq/models/transformer/transformer_base.py", line 97, in build_model
encoder = cls.build_encoder(cfg, src_dict, encoder_embed_tokens)
File "/home/dhamaraiselvi/.local/lib/python3.10/site-packages/fairseq/models/transformer/transformer_legacy.py", line 143, in build_encoder
return super().build_encoder(
File "/home/dhamaraiselvi/.local/lib/python3.10/site-packages/fairseq/models/transformer/transformer_base.py", line 115, in build_encoder
return TransformerEncoderBase(cfg, src_dict, embed_tokens)
File "/home/dhamaraiselvi/.local/lib/python3.10/site-packages/fairseq/models/transformer/transformer_encoder.py", line 96, in __init__
[self.build_encoder_layer(cfg) for i in range(cfg.encoder.layers)]
File "/home/dhamaraiselvi/.local/lib/python3.10/site-packages/fairseq/models/transformer/transformer_encoder.py", line 96, in <listcomp>
[self.build_encoder_layer(cfg) for i in range(cfg.encoder.layers)]
File "/home/dhamaraiselvi/.local/lib/python3.10/site-packages/fairseq/models/transformer/transformer_encoder.py", line 106, in build_encoder_layer
layer = transformer_layer.TransformerEncoderLayerBase(
File "/home/dhamaraiselvi/.local/lib/python3.10/site-packages/fairseq/modules/transformer_layer.py", line 43, in __init__
self.self_attn_layer_norm = LayerNorm(self.embed_dim, export=cfg.export)
File "/home/dhamaraiselvi/.local/lib/python3.10/site-packages/fairseq/modules/layer_norm.py", line 32, in LayerNorm
return FusedLayerNorm(normalized_shape, eps, elementwise_affine)
File "/home/dhamaraiselvi/.local/lib/python3.10/site-packages/apex/normalization/fused_layer_norm.py", line 294, in __init__
fused_layer_norm_cuda = importlib.import_module("fused_layer_norm_cuda")
File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
File "<frozen importlib._bootstrap>", line 1004, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'fused_layer_norm_cuda'
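For anyone triaging this: the traceback means the pure-Python `apex` package imported fine, but the compiled `fused_layer_norm_cuda` extension was never built. A quick runtime check (a diagnostic sketch, not an official fix) that distinguishes the two cases without going through fairseq:

```python
# Diagnostic: is the pure-Python apex package importable, and is the
# compiled CUDA extension importable? find_spec returns None instead
# of raising, so this works even when the extension is missing.
import importlib.util

def ext_available(name: str) -> bool:
    """Return True if the module can be found on the import path."""
    return importlib.util.find_spec(name) is not None

for mod in ("apex", "fused_layer_norm_cuda"):
    status = "found" if ext_available(mod) else "MISSING"
    print(f"{mod}: {status}")

# If apex is found but fused_layer_norm_cuda is MISSING, the install
# ran as Python-only, i.e. the --cpp_ext/--cuda_ext build options
# were ignored, and the extensions need to be rebuilt.
```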
dhamaraiselvi@dhamaraiselvi-hp-zbook-power:~/Machine_Translation/Model_V1$ python3
Python 3.10.12 (main, Nov 20 2023, 15:14:05) [GCC 11.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.