Hello,

I was installing ProGen2 within a Conda env. I followed all the steps in the documentation, except that I replaced:

python3.8 -m venv .venv
source .venv/bin/activate

with:

conda create -n progen -c anaconda python=3.8
source ~/Anaconda/bin/activate progen

When I ran sample.py, I received the following output:
falling back to cpu
falling back to fp32
loading parameters
loading parameters took 0.88s
and the following error:
python3 /lustre/scratch/x_kazlakam/progen/progen2/sample.py --model progen2-large --t 0.8 --p 0.9 --max-length 1024 --num-samples 2 --context 1
401 Client Error: Unauthorized for url: https://huggingface.co/checkpoints/progen2-large/resolve/main/config.json
Traceback (most recent call last):
File "/home/x_kazlakam/.conda/envs/progen/lib/python3.8/site-packages/transformers/configuration_utils.py", line 585, in _get_config_dict
resolved_config_file = cached_path(
File "/home/x_kazlakam/.conda/envs/progen/lib/python3.8/site-packages/transformers/file_utils.py", line 1846, in cached_path
output_path = get_from_cache(
File "/home/x_kazlakam/.conda/envs/progen/lib/python3.8/site-packages/transformers/file_utils.py", line 2050, in get_from_cache
_raise_for_status(r)
File "/home/x_kazlakam/.conda/envs/progen/lib/python3.8/site-packages/transformers/file_utils.py", line 1977, in _raise_for_status
request.raise_for_status()
File "/home/x_kazlakam/.conda/envs/progen/lib/python3.8/site-packages/requests/models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/checkpoints/progen2-large/resolve/main/config.json
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/lustre/scratch/x_kazlakam/progen/progen2/sample.py", line 207, in
main()
File "/lustre/scratch/x_kazlakam/progen/progen2/sample.py", line 145, in main
model = create_model(ckpt=ckpt, fp16=args.fp16).to(device)
File "/lustre/scratch/x_kazlakam/progen/progen2/sample.py", line 57, in create_model
return ProGenForCausalLM.from_pretrained(ckpt)
File "/home/x_kazlakam/.conda/envs/progen/lib/python3.8/site-packages/transformers/modeling_utils.py", line 1268, in from_pretrained
config, model_kwargs = cls.config_class.from_pretrained(
File "/home/x_kazlakam/.conda/envs/progen/lib/python3.8/site-packages/transformers/configuration_utils.py", line 510, in from_pretrained
config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, kwargs)
File "/home/x_kazlakam/.conda/envs/progen/lib/python3.8/site-packages/transformers/configuration_utils.py", line 537, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, kwargs)
File "/home/x_kazlakam/.conda/envs/progen/lib/python3.8/site-packages/transformers/configuration_utils.py", line 618, in _get_config_dict
raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co/' to load this model and it looks like ./checkpoints/progen2-large is not the path to a directory conaining a config.json file.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.