mrhan1993 / Fooocus-API

FastAPI powered API for Fooocus
GNU General Public License v3.0

Bug: Gets stuck after running for a while #187

Closed aiwillcoming closed 5 months ago

aiwillcoming commented 5 months ago

The log keeps repeating:

[Task Queue] Already waiting for 210.29503600206226S
[Task Queue] Already waiting for 210.29503600206226S
[Task Queue] Already waiting for 210.29503600206226S
[Task Queue] Already waiting for 210.29503600206226S
[Task Queue] Already waiting for 210.29503600206226S

konieshadow commented 5 months ago

There is probably an exception somewhere. Can you look for an error log?

aiwillcoming commented 5 months ago

Where should I look?

aiwillcoming commented 5 months ago

The log is completely flooded with that message.

aiwillcoming commented 5 months ago

Worker error: [enforce fail at ..\c10\core\impl\alloc_cpu.cpp:72] data. DefaultCPUAllocator: not enough memory: you tried to allocate 26214400 bytes.
ERROR:root:[enforce fail at ..\c10\core\impl\alloc_cpu.cpp:72] data. DefaultCPUAllocator: not enough memory: you tried to allocate 26214400 bytes.
Traceback (most recent call last):
  File "C:\Users\admin\Desktop\Fooocus-API\fooocusapi\worker.py", line 388, in process_generate
    pipeline.refresh_everything(refiner_model_name=refiner_model_name, base_model_name=base_model_name,
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\admin\Desktop\Fooocus-API\repositories\Fooocus\modules\default_pipeline.py", line 233, in refresh_everything
    refresh_base_model(base_model_name)
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\admin\Desktop\Fooocus-API\repositories\Fooocus\modules\default_pipeline.py", line 69, in refresh_base_model
    model_base = core.load_model(filename)
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\torch\utils\_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "C:\Users\admin\Desktop\Fooocus-API\repositories\Fooocus\modules\core.py", line 151, in load_model
    unet, clip, vae, clip_vision = load_checkpoint_guess_config(ckpt_filename, embedding_directory=path_embeddings)
  File "C:\Users\admin\Desktop\Fooocus-API\repositories\Fooocus\ldm_patched\modules\sd.py", line 469, in load_checkpoint_guess_config
    clip = CLIP(clip_target, embedding_directory=embedding_directory)
  File "C:\Users\admin\Desktop\Fooocus-API\repositories\Fooocus\ldm_patched\modules\sd.py", line 101, in __init__
    self.cond_stage_model = clip(**(params))
  File "C:\Users\admin\Desktop\Fooocus-API\repositories\Fooocus\ldm_patched\modules\sdxl_clip.py", line 41, in __init__
    self.clip_g = SDXLClipG(device=device, dtype=dtype)
  File "C:\Users\admin\Desktop\Fooocus-API\repositories\Fooocus\ldm_patched\modules\sdxl_clip.py", line 12, in __init__
    super().__init__(device=device, freeze=freeze, layer=layer, layer_idx=layer_idx, textmodel_json_config=textmodel_json_config, dtype=dtype,
  File "C:\Users\admin\Desktop\Fooocus-API\repositories\Fooocus\modules\patch_clip.py", line 83, in patched_SDClipModel__init__
    self.transformer = CLIPTextModel(config)
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\transformers\models\clip\modeling_clip.py", line 782, in __init__
    self.text_model = CLIPTextTransformer(config)
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\transformers\models\clip\modeling_clip.py", line 700, in __init__
    self.encoder = CLIPEncoder(config)
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\transformers\models\clip\modeling_clip.py", line 585, in __init__
    self.layers = nn.ModuleList([CLIPEncoderLayer(config) for _ in range(config.num_hidden_layers)])
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\transformers\models\clip\modeling_clip.py", line 585, in <listcomp>
    self.layers = nn.ModuleList([CLIPEncoderLayer(config) for _ in range(config.num_hidden_layers)])
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\transformers\models\clip\modeling_clip.py", line 360, in __init__
    self.mlp = CLIPMLP(config)
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\transformers\models\clip\modeling_clip.py", line 345, in __init__
    self.fc2 = nn.Linear(config.intermediate_size, config.hidden_size)
  File "C:\Users\admin\AppData\Local\Programs\Python\Python311\Lib\site-packages\torch\nn\modules\linear.py", line 96, in __init__
    self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
RuntimeError: [enforce fail at ..\c10\core\impl\alloc_cpu.cpp:72] data. DefaultCPUAllocator: not enough memory: you tried to allocate 26214400 bytes.
[Task Queue] Waiting for task queue become free, job_id=8d1f1229-1cfd-46ab-8695-4b16301302b8

aiwillcoming commented 5 months ago

But there is plenty of memory:

      total  used  free
Mem:  59Gi   14Gi  30Gi
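
As a side note, the 26214400 bytes in the error is only 25 MiB, so the failing allocation itself is tiny. A minimal diagnostic sketch (not from this thread, assuming psutil is installed) that logs what the worker process itself sees at the moment a load fails, since that can differ from host-level free output:

import logging
import psutil

def log_memory_snapshot():
    # Record system and per-process memory so OOM-style failures can be correlated.
    vm = psutil.virtual_memory()
    sw = psutil.swap_memory()
    rss = psutil.Process().memory_info().rss
    logging.error(
        "RAM available=%.1f/%.1f GiB, swap used=%.1f GiB, process RSS=%.1f GiB",
        vm.available / 2**30, vm.total / 2**30,
        sw.used / 2**30, rss / 2**30,
    )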

aiwillcoming commented 5 months ago

Also, a single error like this is enough to deadlock the queue; the exception handling is not robust enough.
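
A minimal sketch of the kind of guard being asked for here, not the actual Fooocus-API worker code; task_queue, task, generate_fn, mark_failed and finish_task are placeholder names. The point is that the slot a task occupies must be released even when generation raises, otherwise every later job waits forever with the "Already waiting ..." message:

import logging

def run_job(task_queue, task, generate_fn):
    # Hypothetical wrapper: generate_fn may raise (e.g. the RuntimeError above).
    try:
        generate_fn(task)
    except Exception:
        logging.exception("Task %s failed", task.job_id)
        task.mark_failed()            # surface the failure to the API client
    finally:
        task_queue.finish_task(task)  # always free the queue slot, success or not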

aiwillcoming commented 5 months ago

Can we add each other on QQ or WeChat to make it easier to stay in touch? If you can leave an email address, I'll send mine to it.

konieshadow commented 5 months ago

Can we add each other on QQ or WeChat to make it easier to stay in touch? If you can leave an email address, I'll send mine to it.

You can find my email address on my GitHub profile. Send me your WeChat ID and I'll add you.

konieshadow commented 5 months ago

Sorry, I've been a bit busy with work lately.