cocktailpeanut / fluxgym

Dead simple FLUX LoRA training UI with LOW VRAM support

Error while removing corrupted file #159

Open NicSTT opened 2 weeks ago

NicSTT commented 2 weeks ago

I keep getting the following error, whether installing manually or via the one-click install.

To get around it, could I put the downloaded file in the required folder myself?

If so, where would that be?

A new version of the following files was downloaded from https://huggingface.co/multimodalart/Florence-2-large-no-flash-attn:

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\fluxgym\env\lib\site-packages\transformers\modeling_utils.py", line 3557, in from_pretrained
    resolved_archive_file = cached_file(
  File "C:\fluxgym\env\lib\site-packages\transformers\utils\hub.py", line 402, in cached_file
    resolved_file = hf_hub_download(
  File "C:\fluxgym\env\lib\site-packages\huggingface_hub\utils\_deprecation.py", line 101, in inner_f
    return f(*args, **kwargs)
  File "C:\fluxgym\env\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\fluxgym\env\lib\site-packages\huggingface_hub\file_download.py", line 1240, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\fluxgym\env\lib\site-packages\huggingface_hub\file_download.py", line 1389, in _hf_hub_download_to_cache_dir
    _download_to_tmp_and_move(
  File "C:\fluxgym\env\lib\site-packages\huggingface_hub\file_download.py", line 1915, in _download_to_tmp_and_move
    http_get(
  File "C:\fluxgym\env\lib\site-packages\huggingface_hub\file_download.py", line 534, in http_get
    raise RuntimeError(
RuntimeError: An error occurred while downloading using hf_transfer. Consider disabling HF_HUB_ENABLE_HF_TRANSFER for better error handling.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\fluxgym\env\lib\site-packages\gradio\queueing.py", line 536, in process_events
    response = await route_utils.call_process_api(
  File "C:\fluxgym\env\lib\site-packages\gradio\route_utils.py", line 322, in call_process_api
    output = await app.get_blocks().process_api(
  File "C:\fluxgym\env\lib\site-packages\gradio\blocks.py", line 1935, in process_api
    result = await self.call_function(
  File "C:\fluxgym\env\lib\site-packages\gradio\blocks.py", line 1532, in call_function
    prediction = await utils.async_iteration(iterator)
  File "C:\fluxgym\env\lib\site-packages\gradio\utils.py", line 671, in async_iteration
    return await iterator.__anext__()
  File "C:\fluxgym\env\lib\site-packages\gradio\utils.py", line 664, in __anext__
    return await anyio.to_thread.run_sync(
  File "C:\fluxgym\env\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "C:\fluxgym\env\lib\site-packages\anyio\_backends\_asyncio.py", line 2405, in run_sync_in_worker_thread
    return await future
  File "C:\fluxgym\env\lib\site-packages\anyio\_backends\_asyncio.py", line 914, in run
    result = context.run(func, *args)
  File "C:\fluxgym\env\lib\site-packages\gradio\utils.py", line 647, in run_sync_iterator_async
    return next(iterator)
  File "C:\fluxgym\env\lib\site-packages\gradio\utils.py", line 809, in gen_wrapper
    response = next(iterator)
  File "C:\fluxgym\app.py", line 278, in run_captioning
    model = AutoModelForCausalLM.from_pretrained(
  File "C:\fluxgym\env\lib\site-packages\transformers\models\auto\auto_factory.py", line 559, in from_pretrained
    return model_class.from_pretrained(
  File "C:\fluxgym\env\lib\site-packages\transformers\modeling_utils.py", line 3644, in from_pretrained
    raise EnvironmentError(
OSError: Can't load the model for 'multimodalart/Florence-2-large-no-flash-attn'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'multimodalart/Florence-2-large-no-flash-attn' is the correct path to a directory containing a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
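The RuntimeError itself points at the first thing to try: turn off hf_transfer so huggingface_hub falls back to its plain Python downloader, which resumes better and reports clearer errors. Below is a minimal, hypothetical pre-download script (assuming you run it inside the same Python environment fluxgym uses, and that pre-populating the Hugging Face cache is enough for app.py to pick the model up):

```python
# Hypothetical pre-download script: fetch the Florence-2 captioning model into the
# Hugging Face cache with hf_transfer disabled, so fluxgym can find it locally later.
import os

# Must be set before huggingface_hub is imported; "0" disables hf_transfer.
os.environ["HF_HUB_ENABLE_HF_TRANSFER"] = "0"

from huggingface_hub import snapshot_download

# Downloads all repo files into ~/.cache/huggingface/hub (resumable if it fails partway).
path = snapshot_download(repo_id="multimodalart/Florence-2-large-no-flash-attn")
print("Cached at:", path)
```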

NicSTT commented 2 weeks ago

I managed a workaround. No matter what I tried (reinstalling, even a different PC), the download always got stuck and then failed with the same message, never getting past about 4 or 5%.

As with a lot of AI-art-related files, they live in the .cache folder under your user directory (on Windows):

C:\Users\YOUR_NAME\.cache\huggingface\hub\models--multimodalart--Florence-2-large-no-flash-attn\snapshots\LONG_ALPHA_NUMERIC_FOLDER_NAME

e.g. 8asd7dsf5asdf0a97asdf5as3df701asdfasf5asd7 (not mine)

Into that folder I copied a manually downloaded copy of the model. It worked after Gym re-downloaded a couple of missing/corrupted files on its own, so it's worth a try if you have similar issues.
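If you're unsure which LONG_ALPHA_NUMERIC_FOLDER_NAME to use, huggingface_hub can report where its cache entries live. A small sketch (assuming the default cache location and that a partial download has already created the repo folder):

```python
# Hypothetical helper: print the snapshot folder(s) the cache holds for the
# Florence-2 repo, i.e. where manually downloaded files should be copied.
from huggingface_hub import scan_cache_dir

for repo in scan_cache_dir().repos:
    if repo.repo_id == "multimodalart/Florence-2-large-no-flash-attn":
        for revision in repo.revisions:
            print(revision.snapshot_path)  # the LONG_ALPHA_NUMERIC_FOLDER_NAME directory
```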

AbinJilson commented 4 days ago

I keep getting a similar error, but while downloading the flux model:

download flux-dev
flux1-dev.sft:   0%|          | 0.00/23.8G [00:00<?, ?B/s]
flux1-dev.sft:  10%|████▍     | 2.49G/23.8G [16:34<2:22:09, 2.50MB/s]
Traceback (most recent call last):
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\huggingface_hub\file_download.py", line 523, in http_get
    hf_transfer.download(
Exception: Error while removing corrupted file: The process cannot access the file because it is being used by another process. (os error 32)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\gradio\queueing.py", line 536, in process_events
    response = await route_utils.call_process_api(
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\gradio\route_utils.py", line 322, in call_process_api
    output = await app.get_blocks().process_api(
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\gradio\blocks.py", line 1935, in process_api
    result = await self.call_function(
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\gradio\blocks.py", line 1532, in call_function
    prediction = await utils.async_iteration(iterator)
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\gradio\utils.py", line 671, in async_iteration
    return await iterator.__anext__()
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\gradio\utils.py", line 664, in __anext__
    return await anyio.to_thread.run_sync(
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\anyio\_backends\_asyncio.py", line 2441, in run_sync_in_worker_thread
    return await future
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\anyio\_backends\_asyncio.py", line 943, in run
    result = context.run(func, *args)
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\gradio\utils.py", line 647, in run_sync_iterator_async
    return next(iterator)
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\gradio\utils.py", line 809, in gen_wrapper
    response = next(iterator)
  File "C:\pinokio\api\fluxgym.git\app.py", line 586, in start_training
    download(base_model)
  File "C:\pinokio\api\fluxgym.git\app.py", line 340, in download
    hf_hub_download(repo_id=repo, local_dir=unet_folder, filename=model_file)
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\huggingface_hub\utils\_deprecation.py", line 101, in inner_f
    return f(*args, **kwargs)
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\huggingface_hub\file_download.py", line 1220, in hf_hub_download
    return _hf_hub_download_to_local_dir(
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\huggingface_hub\file_download.py", line 1515, in _hf_hub_download_to_local_dir
    _download_to_tmp_and_move(
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\huggingface_hub\file_download.py", line 1915, in _download_to_tmp_and_move
    http_get(
  File "C:\pinokio\api\fluxgym.git\env\lib\site-packages\huggingface_hub\file_download.py", line 534, in http_get
    raise RuntimeError(
RuntimeError: An error occurred while downloading using hf_transfer. Consider disabling HF_HUB_ENABLE_HF_TRANSFER for better error handling.
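This is the same hf_transfer failure, just hitting the FLUX base model instead of the captioner; os error 32 on Windows means another process (often an antivirus scanner or a second fluxgym instance) still holds the partial .sft file open. One workaround is to fetch the file yourself with hf_transfer disabled and drop it where app.py's download() expects it. A sketch, where the repo id and target folder are assumptions based on fluxgym's models.yaml (check your install, they may differ):

```python
# Hypothetical manual download of the FLUX dev checkpoint into fluxgym's unet folder.
import os

os.environ["HF_HUB_ENABLE_HF_TRANSFER"] = "0"  # fall back to the resumable Python downloader

from huggingface_hub import hf_hub_download

# Assumed values; confirm repo/filename against models.yaml in your fluxgym install.
hf_hub_download(
    repo_id="cocktailpeanut/xulf-dev",
    filename="flux1-dev.sft",
    local_dir=r"C:\pinokio\api\fluxgym.git\models\unet",
)
```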