gventuri closed this issue 9 months ago
Hey! Make sure you are logged in to the Hub on my-company. For me, this works:
>>> from transformers import AutoModel
>>> model = AutoModel.from_pretrained("bert-base-uncased")
>>> model.push_to_hub("ArthurZ/my-llm")
CommitInfo(commit_url='https://huggingface.co/ArthurZ/my-llm/commit/53cb603f0cdd1d6284f21c541f06a3b8a0ddd2d1', commit_message='Upload model', commit_description='', oid='53cb603f0cdd1d6284f21c541f06a3b8a0ddd2d1', pr_url=None, pr_revision=None, pr_num=None)
@ArthurZucker I tried to log in both with
!huggingface-cli login --token $hf_token
and with
huggingface_hub.login
In both cases I provided my personal token (I'm an admin in the company). If I try to log in again, it says that I'm already connected.
cc @Wauplin you might know better than me what this represents!
@gventuri Can it be that you have a local directory called "my-company/my-llm"
and that it conflicts with the push_to_hub?
(To confirm that you are indeed logged in, you can run huggingface-cli whoami, but that really doesn't seem to be the problem here.)
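The conflict described above is easy to check for up front. Here is a minimal sketch (the helper name and repo id are mine, not part of any library API) that flags when a local folder shadows a Hub repo id before you call push_to_hub:

```python
import os

def hub_repo_id_shadowed(repo_id: str) -> bool:
    """Return True when a local directory shadows a Hub repo id.

    push_to_hub eventually loads/creates a model card, and an existing
    filesystem path with the same name as the repo id gets picked up as
    a local path instead, which makes the push fail.
    """
    return os.path.isdir(repo_id)

# Hypothetical repo id from this thread: warn before pushing.
if hub_repo_id_shadowed("my-company/my-llm"):
    print("Rename or move the local 'my-company/my-llm' folder before pushing.")
```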
@Wauplin thanks a lot, it was conflicting with the local folder, it's now working!
Great to know your problem's solved! :hugs:
I'm having the same error. I checked, and I also have a local directory with the same name as my Hugging Face path (which I am assuming is causing the conflict). Do you have any guidance for remedying this?
I initially tried changing the name of the local path that it saved to, but then I got a different error. It looks like whatever path I pass to model.push_to_hub, it tries to access that same path locally, even though I defined them differently. For reference, I am using QLoRA + ORPO:
base_model = "meta-llama/Meta-Llama-3-8B"
new_model = "XXX/TEST_ORPO_LLaMA3"
local_path = "models"
...
trainer.save_model(local_path)
del trainer, model
gc.collect()
torch.cuda.empty_cache()
tokenizer = AutoTokenizer.from_pretrained(base_model)
fp16_model = AutoModelForCausalLM.from_pretrained(
    base_model,
    low_cpu_mem_usage=True,
    return_dict=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
fp16_model, tokenizer = setup_chat_format(fp16_model, tokenizer)
model = PeftModel.from_pretrained(fp16_model, local_path)
model = model.merge_and_unload()
model.push_to_hub(new_model, use_temp_dir=False)
tokenizer.push_to_hub(new_model, use_temp_dir=False)
FileNotFoundError: [Errno 2] No such file or directory: 'TEST_ORPO_LLaMA3'
EDIT: It seems that even though the base model is merged with the adapters in the Colab environment, the call to model.push_to_hub attempts to retrieve the model locally (i.e., from the folder with the saved trained adapters) at the same path as the repo that I am trying to push to on the Hugging Face Hub.
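One workaround sketch, assuming the collision is purely about the current working directory: change into an empty temporary directory for the push, so no local folder can shadow the repo id. The helper name `push_without_local_shadow` is mine, not a library function:

```python
import os
import tempfile

def push_without_local_shadow(model, repo_id: str, **push_kwargs):
    """Call model.push_to_hub from an empty temporary directory so a
    local folder named like repo_id cannot shadow the Hub repo
    (workaround sketch, not a documented API)."""
    original_cwd = os.getcwd()
    with tempfile.TemporaryDirectory() as tmp:
        os.chdir(tmp)
        try:
            model.push_to_hub(repo_id, **push_kwargs)
        finally:
            # Always restore the caller's working directory.
            os.chdir(original_cwd)
```

The same trick works for tokenizer.push_to_hub, since the shadowing happens wherever the interpreter's working directory points.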
Hey, I just stumbled on this issue. Is there any solution to this, please?
System Info
transformers version: 4.37.0.dev0

Who can help?
@ArthurZucker @younesbelkada

Information
Tasks: an officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction
m = AutoModelForCausalLM.from_pretrained(
    "pretrained-model",
    return_dict=True,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
m.push_to_hub("my-company/my-llm")
IsADirectoryError                         Traceback (most recent call last)
Cell In[5], line 1
----> 1 m.push_to_hub("my-company/my-llm")

File /opt/conda/lib/python3.10/site-packages/transformers/modeling_utils.py:2530, in PreTrainedModel.push_to_hub(self, *args, **kwargs)
   2528 if tags:
   2529     kwargs["tags"] = tags
-> 2530 return super().push_to_hub(*args, **kwargs)

File /opt/conda/lib/python3.10/site-packages/transformers/utils/hub.py:865, in PushToHubMixin.push_to_hub(self, repo_id, use_temp_dir, commit_message, private, token, max_shard_size, create_pr, safe_serialization, revision, commit_description, tags, **deprecated_kwargs)
    860 repo_id = self._create_repo(
    861     repo_id, private=private, token=token, repo_url=repo_url, organization=organization
    862 )
    864 # Create a new empty model card and eventually tag it
--> 865 model_card = create_and_tag_model_card(
    866     repo_id, tags, token=token, ignore_metadata_errors=ignore_metadata_errors
    867 )
    869 if use_temp_dir is None:
    870     use_temp_dir = not os.path.isdir(working_dir)

File /opt/conda/lib/python3.10/site-packages/transformers/utils/hub.py:1120, in create_and_tag_model_card(repo_id, tags, token, ignore_metadata_errors)
   1104 """
   1105 Creates or loads an existing model card and tags it.
   (...)
   1116 the process. Use it at your own risk.
   1117 """
   1118 try:
   1119     # Check if the model card is present on the remote repo
-> 1120     model_card = ModelCard.load(repo_id, token=token, ignore_metadata_errors=ignore_metadata_errors)
   1121 except EntryNotFoundError:
   1122     # Otherwise create a simple model card from template
   1123     model_description = "This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated."

File /opt/conda/lib/python3.10/site-packages/huggingface_hub/repocard.py:185, in RepoCard.load(cls, repo_id_or_path, repo_type, token, ignore_metadata_errors)
    182     raise ValueError(f"Cannot load RepoCard: path not found on disk ({repo_id_or_path}).")
    184 # Preserve newlines in the existing file.
--> 185 with card_path.open(mode="r", newline="", encoding="utf-8") as f:
    186     return cls(f.read(), ignore_metadata_errors=ignore_metadata_errors)

File /opt/conda/lib/python3.10/pathlib.py:1119, in Path.open(self, mode, buffering, encoding, errors, newline)
   1117 if "b" not in mode:
   1118     encoding = io.text_encoding(encoding)
-> 1119 return self._accessor.open(self, mode, buffering, encoding, errors,
   1120     newline)

IsADirectoryError: [Errno 21] Is a directory: 'my-company/my-llm'
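The last frames of the traceback show why this happens: RepoCard.load first checks whether its argument exists on disk and only treats it as a Hub repo id otherwise. A simplified sketch of that dispatch (the function name here is mine, not the library's):

```python
from pathlib import Path

def resolve_card_source(repo_id_or_path: str) -> str:
    """Simplified sketch of the path-vs-repo-id dispatch: an existing
    filesystem path wins over a Hub repo id. A local 'my-company/my-llm'
    directory therefore gets opened as if it were a model card file,
    which raises IsADirectoryError."""
    if Path(repo_id_or_path).exists():
        return "local"  # would be opened with Path.open(...)
    return "hub"        # would be fetched from the Hub instead
```

So the fix reported earlier in this thread still applies: rename or move the local directory (or run the push from a different working directory) so the repo id no longer resolves to a path on disk.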