I've been trying to run the MultiPL-E evaluation on a Mistral model, but after 6 hours the process hangs and stops making progress. I've rerun it multiple times with the same result. Could someone please take a look, or let me know if I'm missing something?
Logs:
2024-04-02 19:08:07.633142: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:9261] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2024-04-02 19:08:07.633175: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:9261] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2024-04-02 19:08:07.633244: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:607] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2024-04-02 19:08:07.633302: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:607] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2024-04-02 19:08:07.766054: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1515] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2024-04-02 19:08:07.766068: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1515] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
Selected Tasks: ['multiple-js']
Loading model in 4bit
/opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:466: FutureWarning: The use_auth_token argument is deprecated and will be removed in v5 of Transformers. Please use token instead.
warnings.warn(
Loading model in 4bit
/opt/conda/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:466: FutureWarning: The use_auth_token argument is deprecated and will be removed in v5 of Transformers. Please use token instead.
warnings.warn(
config.json: 100%|█████████████████████████| 1.09k/1.09k [00:00<00:00, 5.31MB/s]
/opt/conda/lib/python3.10/site-packages/transformers/quantizers/auto.py:155: UserWarning: You passed quantization_config or equivalent parameters to from_pretrained but the model you're loading already has a quantization_config attribute. The quantization_config from the model will be used.
warnings.warn(warning_msg)
/opt/conda/lib/python3.10/site-packages/transformers/quantizers/auto.py:155: UserWarning: You passed quantization_config or equivalent parameters to from_pretrained but the model you're loading already has a quantization_config attribute. The quantization_config from the model will be used.
warnings.warn(warning_msg)
model.safetensors: 100%|████████████████████| 4.13G/4.13G [00:21<00:00, 190MB/s]
generation_config.json: 100%|███████████████████| 111/111 [00:00<00:00, 464kB/s]
/opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:720: FutureWarning: The use_auth_token argument is deprecated and will be removed in v5 of Transformers. Please use token instead.
warnings.warn(
/opt/conda/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py:720: FutureWarning: The use_auth_token argument is deprecated and will be removed in v5 of Transformers. Please use token instead.
warnings.warn(
tokenizer_config.json: 100%|███████████████████| 964/964 [00:00<00:00, 4.72MB/s]
tokenizer.model: 100%|████████████████████████| 493k/493k [00:00<00:00, 257MB/s]
tokenizer.json: 100%|██████████████████████| 1.80M/1.80M [00:00<00:00, 17.2MB/s]
special_tokens_map.json: 100%|█████████████████| 552/552 [00:00<00:00, 3.60MB/s]
generation mode only
/opt/conda/lib/python3.10/site-packages/datasets/load.py:1461: FutureWarning: The repository for nuprl/MultiPL-E contains custom code which must be executed to correctly load the dataset. You can inspect the repository content at https://hf.co/datasets/nuprl/MultiPL-E
You can avoid this message in future by passing the argument trust_remote_code=True.
Passing trust_remote_code=True will be mandatory to load this dataset from the next major release of datasets.
warnings.warn(
/opt/conda/lib/python3.10/site-packages/datasets/load.py:1461: FutureWarning: The repository for nuprl/MultiPL-E contains custom code which must be executed to correctly load the dataset. You can inspect the repository content at https://hf.co/datasets/nuprl/MultiPL-E
You can avoid this message in future by passing the argument trust_remote_code=True.
Passing trust_remote_code=True will be mandatory to load this dataset from the next major release of datasets.
warnings.warn(
Downloading builder script: 100%|██████████| 4.05k/4.05k [00:00<00:00, 19.5MB/s]
Downloading metadata: 100%|██████████████████| 478k/478k [00:00<00:00, 9.10MB/s]
Downloading readme: 100%|██████████████████| 99.6k/99.6k [00:00<00:00, 3.28MB/s]
Downloading data: 218kB [00:00, 21.2MB/s]
Generating test split: 100%|█████████| 161/161 [00:00<00:00, 5709.24 examples/s]
Downloading builder script: 100%|██████████| 4.05k/4.05k [00:00<00:00, 14.3MB/s]
Downloading metadata: 100%|██████████████████| 478k/478k [00:00<00:00, 9.82MB/s]
Downloading readme: 100%|██████████████████| 99.6k/99.6k [00:00<00:00, 4.32MB/s]
number of problems for this task is 161
0%| | 0/81 [00:00<?, ?it/s]/kaggle/working/bigcode-evaluation-harness/bigcode_eval/utils.py:119: UserWarning: n_copies (n_samples/batch_size) was changed from 1 to 2 because n_tasks isn't proportional to num devices
warnings.warn(
94it [6:43:45, 283.17s/it]
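For reference, the `n_copies` warning near the end of the log is the harness duplicating generations so the work divides evenly across devices, which also explains why the progress bar runs past its initial 0/81 total. Below is a minimal sketch of that arithmetic, not the harness's actual internals; the 2-device count is an assumption (Kaggle's dual-GPU setup) and `n_samples`/`batch_size` of 1 match the defaults implied by the warning:

```python
import math

n_tasks = 161     # "number of problems for this task is 161"
num_devices = 2   # assumption: Kaggle dual-GPU runtime
n_samples = 1
batch_size = 1

n_copies = math.ceil(n_samples / batch_size)  # starts at 1
# 161 tasks * 1 copy doesn't split evenly across 2 devices,
# so the harness bumps n_copies from 1 to 2 (per the warning):
if (n_tasks * n_copies) % num_devices != 0:
    n_copies = 2

per_device_iters = math.ceil(n_tasks * n_copies / num_devices)
print(n_copies, per_device_iters)  # 2 161
```

So each device ends up with roughly 161 iterations rather than the ~81 the progress bar first reported, which is consistent with the run reaching 94it before stalling.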