Closed: johnjyang closed this issue 8 months ago.
I am getting this error when trying the Hugging Face model. Please also make the Hugging Face model the default.
$ python run.py
/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/trl/trainer/ppo_config.py:141: UserWarning: The `optimize_cuda_cache` arguement will be deprecated soon, please use `optimize_device_cache` instead.
  warnings.warn(
Traceback (most recent call last):
  File "/home/namin/llm-verified-john/run.py", line 1, in <module>
    import llm
  File "/home/namin/llm-verified-john/llm.py", line 12, in <module>
    _, model, tokenizer = load_model()
NameError: name 'load_model' is not defined
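For context, a stand-in sketch of this failure and of the qualified call that the later traceback shows was adopted. The stub below is hypothetical; the real module is the repo's hugginface_generate.py (spelled as it appears in the traceback):

```python
# A bare call to load_model() raises NameError when the name was never
# defined or imported; qualifying the call through the module resolves it.
# SimpleNamespace is a stand-in stub for the repo's hugginface_generate.py.
import types

hugginface_generate = types.SimpleNamespace(
    load_model=lambda: (None, "model", "tokenizer")  # stub return values
)

# Before the fix: `load_model()` alone would raise NameError here.
# After the fix: the call is qualified through the module.
_, model, tokenizer = hugginface_generate.load_model()
print(model, tokenizer)
```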
Pushed a fix - thanks for checking this part of the code. I will be able to test on a VM with an A100 soon.
I am not sure I like capitalizing the config vars like LANG, etc. Can we keep them lower case? Any reason?
I capitalized them because they are constants, and also to avoid confusion (readability) with module names like "lang" and "llm".
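A minimal illustration of the collision being avoided (the values below are stand-ins, not the repo's actual config):

```python
# In this repo, `lang` and `llm` are module names; a lowercase config variable
# `lang` would shadow the module after `import lang`. Uppercase constants keep
# configuration visually distinct from modules.
LANG = "Dafny"
LLM = "huggingface"

def summary() -> str:
    # Constants read unambiguously as configuration, not as modules.
    return f"{LLM}:{LANG}"

print(summary())
```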
I am getting this error now:
$ python run.py
/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/trl/trainer/ppo_config.py:141: UserWarning: The `optimize_cuda_cache` arguement will be deprecated soon, please use `optimize_device_cache` instead.
  warnings.warn(
Traceback (most recent call last):
  File "/home/namin/llm-verified-john/run.py", line 1, in <module>
    import llm
  File "/home/namin/llm-verified-john/llm.py", line 13, in <module>
    _, model, tokenizer = hugginface_generate.load_model()
  File "/home/namin/llm-verified-john/hugginface_generate.py", line 32, in load_model
    base_model = AutoModelForCausalLMWithValueHead.from_pretrained(
  File "/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/trl/models/modeling_base.py", line 205, in from_pretrained
    pretrained_model = cls.transformers_parent_class.from_pretrained(
  File "/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 488, in from_pretrained
    resolved_config_file = cached_file(
  File "/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/transformers/utils/hub.py", line 430, in cached_file
    resolved_file = hf_hub_download(
  File "/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 110, in _inner_fn
    validate_repo_id(arg_value)
  File "/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 164, in validate_repo_id
    raise HFValidationError(
huggingface_hub.utils._validators.HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name, max length is 96: './my_ppo_model'.
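One way to sidestep this validation error, sketched under the assumption that `./my_ppo_model` is meant to be a local checkpoint directory: resolve existing local paths to absolute form, which `from_pretrained` accepts without Hub repo-id validation, and only pass bare names on to the Hub. The helper name is hypothetical:

```python
from pathlib import Path

def resolve_model_ref(name: str) -> str:
    """Return a model reference that from_pretrained will accept.

    Hub repo-id validation rejects names like './my_ppo_model', so a local
    checkpoint must either be given as an existing (absolute) path or have
    the leading './' stripped before being treated as a repo id.
    """
    p = Path(name)
    if p.exists():
        # A real local directory: an absolute path bypasses repo-id validation.
        return str(p.resolve())
    # Not on disk: strip './' so the name is a valid Hub repo id.
    return name.removeprefix("./")
```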
I'll let you test this further before taking another look.
Now getting
$ python run.py
/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/trl/trainer/ppo_config.py:141: UserWarning: The `optimize_cuda_cache` arguement will be deprecated soon, please use `optimize_device_cache` instead.
  warnings.warn(
/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:472: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
  warnings.warn(
Loading checkpoint shards: 100%|██████████████████| 7/7 [00:45<00:00,  6.46s/it]
/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/transformers/utils/hub.py:374: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
  warnings.warn(
You are using the default legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=False`. This should only be set if you understand what it means, and thouroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565
Traceback (most recent call last):
  File "/home/namin/llm-verified-john/run.py", line 50, in <module>
    montecarlo.simulate(expansion_count)
  File "/home/namin/llm-verified-john/montecarlo/montecarlo.py", line 51, in simulate
    self.expand(current_node)
  File "/home/namin/llm-verified-john/montecarlo/montecarlo.py", line 54, in expand
    self.child_finder(node, self)
  File "/home/namin/llm-verified-john/run.py", line 34, in child_finder
    text = generate_complete(node.state, montecarlo)
  File "/home/namin/llm-verified-john/run.py", line 17, in generate_complete
    text = llm.generate(text, 1)[0]
  File "/home/namin/llm-verified-john/llm.py", line 28, in generate
    return gen(prompt, model_generation_args, num)
  File "/home/namin/llm-verified-john/llm.py", line 20, in gen
    with torch.no_grad():
NameError: name 'torch' is not defined
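The root cause is Python's call-time name resolution: `gen` compiles fine without `torch` in scope and only fails when executed. A stdlib stand-in sketch of the pattern (the actual fix here would be a top-level `import torch` in llm.py):

```python
# Python resolves a function's global names when the function runs, not when
# it is defined, so a missing module-level import surfaces only at call time.

def gen_broken():
    # Would raise NameError if called before the import below executes.
    return statistics.mean([1, 2, 3])

import statistics  # the analogous fix in llm.py is `import torch` at the top

def gen_fixed():
    return statistics.mean([1, 2, 3])

print(gen_fixed())  # 2
```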
$ python run_verifier_feedback.py
/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/trl/trainer/ppo_config.py:141: UserWarning: The `optimize_cuda_cache` arguement will be deprecated soon, please use `optimize_device_cache` instead.
  warnings.warn(
/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py:472: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
  warnings.warn(
Loading checkpoint shards: 100%|██████████████████| 7/7 [00:56<00:00,  8.13s/it]
/home/namin/mambaforge/envs/trl/lib/python3.10/site-packages/transformers/utils/hub.py:374: FutureWarning: The `use_auth_token` argument is deprecated and will be removed in v5 of Transformers. Please use `token` instead.
  warnings.warn(
You are using the default legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. This is expected, and simply means that the `legacy` (previous) behavior will be used so nothing changes for you. If you want to use the new behaviour, set `legacy=False`. This should only be set if you understand what it means, and thouroughly read the reason why this was added as explained in https://github.com/huggingface/transformers/pull/24565
Traceback (most recent call last):
  File "/home/namin/llm-verified-john/run_verifier_feedback.py", line 8, in <module>
    from prompts import prompt, expansion_count, min_lines, check_func
  File "/home/namin/llm-verified-john/prompts.py", line 214, in <module>
    ```{lang.lower()}
NameError: name 'lang' is not defined. Did you mean: 'range'?
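The f-string in prompts.py is evaluated at import time, so `lang` must be bound before line 214 runs. A sketch of the likely shape of the fix, assuming the constant lives in the lang_config.py mentioned in the reply (the value below is a stand-in):

```python
# Stand-in for `from lang_config import LANG`; the prompt template then
# lowercases it to open a language-tagged code fence.
LANG = "Coq"

fence_open = f"```{LANG.lower()}"
print(fence_open)  # ```coq
```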
Thanks! model_config.py and lang_config.py
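A hypothetical sketch of how the split between the two config modules might look; the actual contents of model_config.py and lang_config.py are assumptions:

```python
# lang_config.py (sketch): target-language constants
LANG = "Dafny"

# model_config.py (sketch): model-related constants
BASE_MODEL_NAME = "my_ppo_model"
```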