jiasenlu / vilbert_beta


No module named 'fused_layer_norm_cuda' #69


Xingxl2studious commented 2 years ago

Hi! I'm running the demo. I downloaded pytorch_model_9.bin, and I hit this error:

```
---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
 in ()
     65 else:
     66     model = VILBertForVLTasks.from_pretrained(
---> 67         args.from_pretrained, config=config, num_labels=num_labels, default_gpu=default_gpu
     68     )
     69

8 frames
/content/vilbert-multi-task/vilbert/utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    933
    934         # Instantiate model.
--> 935         model = cls(config, *model_args, **model_kwargs)
    936
    937         if state_dict is None and not from_tf:

/content/vilbert-multi-task/vilbert/vilbert.py in __init__(self, config, num_labels, dropout_prob, default_gpu)
   1603         self.num_labels = num_labels
   1604
-> 1605         self.bert = BertModel(config)
   1606         self.dropout = nn.Dropout(dropout_prob)
   1607         self.cls = BertPreTrainingHeads(

/content/vilbert-multi-task/vilbert/vilbert.py in __init__(self, config)
   1292         # initilize word embedding
   1293         if config.model == "bert":
-> 1294             self.embeddings = BertEmbeddings(config)
   1295         elif config.model == "roberta":
   1296             self.embeddings = RobertaEmbeddings(config)

/content/vilbert-multi-task/vilbert/vilbert.py in __init__(self, config)
    338         # self.LayerNorm is not snake-cased to stick with TensorFlow model variable name and be able to load
    339         # any TensorFlow checkpoint file
--> 340         self.LayerNorm = BertLayerNorm(config.hidden_size, eps=1e-12)
    341         self.dropout = nn.Dropout(config.hidden_dropout_prob)
    342

/usr/local/lib/python3.7/dist-packages/apex-0.1-py3.7.egg/apex/normalization/fused_layer_norm.py in __init__(self, normalized_shape, eps, elementwise_affine)
    131
    132         global fused_layer_norm_cuda
--> 133         fused_layer_norm_cuda = importlib.import_module("fused_layer_norm_cuda")
    134
    135         if isinstance(normalized_shape, numbers.Integral):

/usr/lib/python3.7/importlib/__init__.py in import_module(name, package)
    125             break
    126         level += 1
--> 127     return _bootstrap._gcd_import(name[level:], package, level)
    128
    129

/usr/lib/python3.7/importlib/_bootstrap.py in _gcd_import(name, package, level)

/usr/lib/python3.7/importlib/_bootstrap.py in _find_and_load(name, import_)

/usr/lib/python3.7/importlib/_bootstrap.py in _find_and_load_unlocked(name, import_)

ModuleNotFoundError: No module named 'fused_layer_norm_cuda'
```
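
For context: `fused_layer_norm_cuda` is the compiled extension that NVIDIA apex only builds when it is installed with its C++/CUDA extensions; a Python-only apex install leaves it out. `FusedLayerNorm` only imports the compiled module when the first layer is constructed, which is why the failure appears inside `BertLayerNorm(...)` rather than at the apex import. Below is a minimal workaround sketch, not something the vilbert_beta code itself does: it assumes `BertLayerNorm` in `vilbert/vilbert.py` is a module-level alias for apex's `FusedLayerNorm` (as the traceback suggests) and that the plain PyTorch layer norm is acceptable for the demo.

```python
# Workaround sketch (assumption): replace apex's FusedLayerNorm with
# torch.nn.LayerNorm before the model is built, so the missing
# 'fused_layer_norm_cuda' extension is never imported.
import torch.nn as nn
import vilbert.vilbert as vilbert_module

# BertLayerNorm is called as BertLayerNorm(hidden_size, eps=1e-12), which
# matches nn.LayerNorm's signature, so a direct substitution should work.
vilbert_module.BertLayerNorm = nn.LayerNorm
```

If you want the fused kernel itself, the other common route is to rebuild apex from source with its extensions enabled (the `--cpp_ext` / `--cuda_ext` build options described in the apex README), so that `fused_layer_norm_cuda` actually gets compiled for your CUDA/PyTorch combination.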