lllyasviel / Fooocus

Focus on prompting and generating
GNU General Public License v3.0
40k stars 5.52k forks

Default huggingface transformers cache directory may not be writable on Windows #1143

Open Hummenix opened 9 months ago

Hummenix commented 9 months ago

Describe the problem

When running entry_with_update.py on Windows 10, an error related to the transformers cache directory may appear:

There was a problem when trying to write in your cache folder (C:\Users\USERNAME/.cache\huggingface\hub). You should set the environment variable TRANSFORMERS_CACHE to a writable directory.

This can be fixed by manually setting the TRANSFORMERS_CACHE environment variable to a writable directory at the beginning of entry_with_update.py (or another suitable location), for example os.environ['TRANSFORMERS_CACHE'] = os.path.join('.', '.cache'). Note that a plain string literal like '.\.cache' triggers an invalid-escape-sequence warning in newer Python versions, so os.path.join or a raw string is preferable.
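As a minimal sketch of that workaround (the cache path here is illustrative, and the assignment must run before transformers, or anything that imports it, is loaded):

```python
import os

# Redirect the Hugging Face transformers cache to a writable local
# directory before any library reads the variable at import time.
os.environ["TRANSFORMERS_CACHE"] = os.path.join(".", ".cache")
```

Using os.path.join avoids backslash-escape pitfalls in Windows paths and keeps the snippet portable.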

Full Console Log

Y:\ImageGen>.\python_embeded\python.exe -s Fooocus\entry_with_update.py --directml
Already up-to-date
Update succeeded.
[System ARGV] ['Fooocus\\entry_with_update.py', '--directml']
Python 3.10.9 (tags/v3.10.9:1dd9be6, Dec  6 2022, 20:01:21) [MSC v.1934 64 bit (AMD64)]
Fooocus version: 2.1.824
Running on local URL:  http://127.0.0.1:7865

To create a public link, set `share=True` in `launch()`.
Using directml with device:
Total VRAM 1024 MB, total RAM 16349 MB
Set vram state to: NORMAL_VRAM
Disabling smart memory management
Device: privateuseone
VAE dtype: torch.float32
Using sub quadratic optimization for cross attention, if you have memory or speed issues try using: --use-split-cross-attention
There was a problem when trying to write in your cache folder (C:\Users\USERNAME/.cache\huggingface\hub). You should set the environment variable TRANSFORMERS_CACHE to a writable directory.
Hummenix commented 9 months ago

So I spent a bit of time crafting a more mature solution to the issue. At the bottom of this comment you will find the console output from running the program with the fix applied.

Here is the code I wrote:

import os
import platform

if platform.system() == "Windows":
    # Double backslash so the string is not treated as an escape sequence.
    print("Windows detected. Assigning cache directory to Transformers in AppData\\Local.")
    transformers_cache_directory = os.path.join(os.getenv('LOCALAPPDATA'), 'transformers_cache')
    if not os.path.exists(transformers_cache_directory):
        try:
            os.makedirs(transformers_cache_directory, exist_ok=True)
            print(f"First launch. Directory '{transformers_cache_directory}' created successfully.")
        except OSError as e:
            print(f"Error creating directory '{transformers_cache_directory}': {e}")
    else:
        print(f"Directory '{transformers_cache_directory}' already exists.")
    os.environ['TRANSFORMERS_CACHE'] = transformers_cache_directory
    print("Environment variable assigned.")
    del transformers_cache_directory
else:
    print("Windows not detected. Assignment of Transformers cache directory not necessary.")

I implemented this code in launch.py within the prepare_environment() function, directly above the return statement.
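A variation worth considering would apply the override only when the default cache directory is actually unwritable, rather than on every Windows launch. A sketch of such a check (the helper name is hypothetical, not part of Fooocus):

```python
import os
import tempfile

def cache_dir_writable(path: str) -> bool:
    """Return True if a file can be created inside `path`."""
    try:
        os.makedirs(path, exist_ok=True)
        # TemporaryFile creates and deletes a throwaway file in `path`,
        # which fails with OSError if the directory is not writable.
        with tempfile.TemporaryFile(dir=path):
            pass
        return True
    except OSError:
        return False
```

With this, the AppData\Local fallback could be assigned only when cache_dir_writable() returns False for the default ~/.cache/huggingface/hub location.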

The terminal logs follow, with the important changes highlighted using diff markers (- for removed lines, + for added lines). Before:

[INSTALL_DIRECTORY]>.\python_embeded\python.exe -s Fooocus\entry_with_update.py --directml
Already up-to-date
Update succeeded.
[System ARGV] ['Fooocus\\entry_with_update.py', '--directml']
Python 3.10.9 (tags/v3.10.9:1dd9be6, Dec  6 2022, 20:01:21) [MSC v.1934 64 bit (AMD64)]
Fooocus version: 2.1.824
Running on local URL:  http://127.0.0.1:7865

To create a public link, set `share=True` in `launch()`.
Using directml with device:
Total VRAM 1024 MB, total RAM 16349 MB
Set vram state to: NORMAL_VRAM
Disabling smart memory management
Device: privateuseone
VAE dtype: torch.float32
Using sub quadratic optimization for cross attention, if you have memory or speed issues try using: --use-split-cross-attention
- There was a problem when trying to write in your cache folder (C:\Users\USERNAME/.cache\huggingface\hub). You should set the environment variable TRANSFORMERS_CACHE to a writable directory.
Refiner unloaded.
model_type EPS
adm 2816

After:

[INSTALL_DIRECTORY]>.\python_embeded\python.exe -s Fooocus\entry_with_update.py --directml
Already up-to-date
Update succeeded.
[System ARGV] ['Fooocus\\entry_with_update.py', '--directml']
Python 3.10.9 (tags/v3.10.9:1dd9be6, Dec  6 2022, 20:01:21) [MSC v.1934 64 bit (AMD64)]
Fooocus version: 2.1.824
+ Windows detected. Assigning cache directory to Transformers in AppData\Local.
+ Directory 'C:\Users\USERNAME\AppData\Local\transformers_cache' already exists.
+ Environment variable assigned.
Running on local URL:  http://127.0.0.1:7865

To create a public link, set `share=True` in `launch()`.
Using directml with device:
Total VRAM 1024 MB, total RAM 16349 MB
Set vram state to: NORMAL_VRAM
Disabling smart memory management
Device: privateuseone
VAE dtype: torch.float32
Using sub quadratic optimization for cross attention, if you have memory or speed issues try using: --use-split-cross-attention
Refiner unloaded.
model_type EPS
adm 2816