unslothai / unsloth

Finetune Llama 3.2, Mistral, Phi, Qwen 2.5 & Gemma LLMs 2-5x faster with 80% less memory
https://unsloth.ai
Apache License 2.0

Unable to load locally located adapter #1124

Closed yurkoff-mv closed 1 month ago

yurkoff-mv commented 1 month ago

The code makes an unconditional call to the Hugging Face Hub, even when the adapter is stored locally:

        if SUPPORTS_LLAMA32:
            # New transformers need to check manually.
            files = HfFileSystem(token = token).glob(os.path.join(model_name, "*.json"))
            files = (os.path.split(x)[-1] for x in files)
            if sum(x == "adapter_config.json" or x == "config.json" for x in files) >= 2:
                both_exist = True
            pass
        pass

If the adapter exists only on the local filesystem and not in an HF repository, this call fails with an error.

Sneakr commented 1 month ago

@yurkoff-mv Did you update Unsloth to the latest version?

https://github.com/unslothai/unsloth/commit/79a2112ca4a775ce0b3cb75f5074136cb54ea6df

yurkoff-mv commented 1 month ago

Updated, saw the changes, everything worked! Thank you very much!