{"logger": "uvicorn.error", "timestamp": "2024-09-20T14:44:45.950538Z", "severity": "INFO", "message": "Started server process [7]"}
{"logger": "uvicorn.error", "timestamp": "2024-09-20T14:44:45.950694Z", "severity": "INFO", "message": "Waiting for application startup."}
{"logger": "uvicorn.error", "timestamp": "2024-09-20T14:44:45.954522Z", "severity": "INFO", "message": "Application startup complete."}
{"logger": "uvicorn.error", "timestamp": "2024-09-20T14:44:45.956042Z", "severity": "INFO", "message": "Uvicorn running on http://0.0.0.0:5000 (Press CTRL+C to quit)"}
Downloading updated weights manifest from https://raw.githubusercontent.com/fofr/cog-comfyui/main/weights.json
Download from https://raw.githubusercontent.com/fofr/cog-comfyui/main/weights.json timed out
⏳ Downloading mobilenet_v2-b0353104.pth to /root/.cache/torch/hub/checkpoints/
✅ mobilenet_v2-b0353104.pth downloaded to /root/.cache/torch/hub/checkpoints/ in 4.05s, size: 13.55MB
Total VRAM 6144 MB, total RAM 15999 MB
pytorch version: 2.4.1+cu121
Set vram state to: NORMAL_VRAM
Device: cuda:0 NVIDIA GeForce RTX 3060 Laptop GPU : cudaMallocAsync
Using pytorch cross attention
** User settings have been changed to be stored on the server instead of browser storage. **
** For multi-user setups add the --multi-user CLI argument to enable multiple user profiles. **
[Prompt Server] web root: /src/ComfyUI/web
/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/kornia/feature/lightglue.py:44: FutureWarning: torch.cuda.amp.custom_fwd(args...) is deprecated. Please use torch.amp.custom_fwd(args..., device_type='cuda') instead.
@torch.cuda.amp.custom_fwd(cast_inputs=torch.float32)
Import times for custom nodes:
0.0 seconds: /src/ComfyUI/custom_nodes/websocket_image_save.py
Setting output directory to: /tmp/outputs
Setting input directory to: /tmp/inputs
Starting server
To see the GUI go to: http://127.0.0.1:8188
Server started in 6.01 seconds
{"logger": "cog.server.probes", "timestamp": "2024-09-20T14:45:01.506490Z", "severity": "INFO", "message": "Not running in Kubernetes: disabling probe helpers."}
{"prediction_id": null, "logger": "cog.server.runner", "timestamp": "2024-09-20T14:47:08.255819Z", "severity": "INFO", "message": "starting prediction"}
{"prediction_id": null, "logger": "cog.server.runner", "timestamp": "2024-09-20T14:47:08.256301Z", "severity": "INFO", "message": "started prediction"}
Checking inputs
Checking weights
⏳ Downloading majicmixRealistic_v7.safetensors to ComfyUI/models/checkpoints
✅ majicmixRealistic_v7.safetensors downloaded to ComfyUI/models/checkpoints in 204.79s, size: 2033.83MB
Randomising seed to 2561272043
Running workflow
Executing node 4, title: Load Checkpoint, class type: CheckpointLoaderSimple
Executing node 6, title: CLIP Text Encode (Prompt), class type: CLIPTextEncode
Executing node 7, title: CLIP Text Encode (Prompt), class type: CLIPTextEncode
Executing node 5, title: Empty Latent Image, class type: EmptyLatentImage
Executing node 3, title: KSampler, class type: KSampler
Traceback (most recent call last):
  File "/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/cog/server/worker.py", line 352, in _predict
    result = predict(**payload)
  File "/src/predict.py", line 115, in predict
    self.comfyUI.run_workflow(wf)
  File "/src/comfyui.py", line 250, in run_workflow
    self.wait_for_prompt_completion(workflow, prompt_id)
  File "/src/comfyui.py", line 192, in wait_for_prompt_completion
    raise Exception(
Exception: There was an error executing your workflow:

{
  "type": "execution_error",
  "data": {
    "prompt_id": "0515507c-4ab5-411b-9b42-6b3a4032a31a",
    "node_id": "3",
    "node_type": "KSampler",
    "executed": [
      "4",
      "7",
      "5",
      "6"
    ],
    "exception_message": "[Errno 32] Broken pipe",
    "exception_type": "BrokenPipeError",
    "traceback": [
      "  File \"/src/ComfyUI/execution.py\", line 317, in execute\n    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)\n",
      "  File \"/src/ComfyUI/execution.py\", line 192, in get_output_data\n    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)\n",
      "  File \"/src/ComfyUI/execution.py\", line 169, in _map_node_over_list\n    process_inputs(input_dict, i)\n",
      "  File \"/src/ComfyUI/execution.py\", line 158, in process_inputs\n    results.append(getattr(obj, func)(**inputs))\n",
      "  File \"/src/ComfyUI/nodes.py\", line 1429, in sample\n    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)\n",
      "  File \"/src/ComfyUI/nodes.py\", line 1396, in common_ksampler\n    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,\n",
      "  File \"/src/ComfyUI/comfy/sample.py\", line 43, in sample\n    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)\n",
      "  File \"/src/ComfyUI/comfy/samplers.py\", line 829, in sample\n    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)\n",
      "  File \"/src/ComfyUI/comfy/samplers.py\", line 729, in sample\n    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)\n",
      "  File \"/src/ComfyUI/comfy/samplers.py\", line 716, in sample\n    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)\n",
      "  File \"/src/ComfyUI/comfy/samplers.py\", line 695, in inner_sample\n    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)\n",
      "  File \"/src/ComfyUI/comfy/samplers.py\", line 600, in sample\n    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/torch/utils/_contextlib.py\", line 116, in decorate_context\n    return func(*args, **kwargs)\n",
      "  File \"/src/ComfyUI/comfy/k_diffusion/sampling.py\", line 133, in sample_euler\n    for i in trange(len(sigmas) - 1, disable=disable):\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/auto.py\", line 37, in trange\n    return tqdm(range(*args), **kwargs)\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/asyncio.py\", line 24, in __init__\n    super().__init__(iterable, *args, **kwargs)\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\", line 1098, in __init__\n    self.refresh(lock_args=self.lock_args)\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\", line 1347, in refresh\n    self.display()\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\", line 1495, in display\n    self.sp(self.__str__() if msg is None else msg)\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\", line 459, in print_status\n    fp_write('\r' + s + (' ' * max(last_len[0] - len_s, 0)))\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\", line 452, in fp_write\n    fp.write(str(s))\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/utils.py\", line 196, in inner\n    return func(*args, **kwargs)\n"
    ],
    "current_inputs": {
      "seed": [
        2561272043
      ],
      "steps": [
        20
      ],
      "cfg": [
        8.0
      ],
      "sampler_name": [
        "euler"
      ],
      "scheduler": [
        "normal"
      ],
      "denoise": [
        1.0
      ],
      "model": [
        "<comfy.model_patcher.ModelPatcher object at 0x7f90c054c520>"
      ],
      "positive": [
        "[[tensor([[[-0.3854, 0.0130, -0.0590, ..., -0.4767, -0.2926, 0.0560],\n [ 0.3734, -0.7969, 1.7699, ..., -0.6371, 0.8131, 0.9324],\n [-1.0823, -0.4410, 0.4324, ..., 0.7239, -1.3086, 0.4494],\n ...,\n [-0.9714, 1.0991, -0.9264, ..., -0.5042, -0.5324, 0.1151],\n [-0.9631, 1.0848, -0.9526, ..., -0.5188, -0.5675, 0.1002],\n [-0.9225, 1.1443, -0.9150, ..., -0.5434, -0.5610, 0.0459]]]), {'pooled_output': tensor([[-1.0293e+00, 7.4614e-01, -1.3600e+00, 1.6161e+00, -4.1086e+00,\n -1.4178e+00, -3.1375e-01, -3.696e+00, -9.3316e-01,\n -2.1077e+00, 3.9704e-01, -9.4896e-01, 9.1429e-02, -5.5170e-01,\n 1.0698e-03, -1.3850e+00, -8.4948e-02, -1.0574e-01, -1.7478e+00,\n 1.5417e-01, 3.1209e-01, -1.0289e+00, -6.8465e-01, 
9.4885e-01,\n 1.4138e+00, -6.2273e-01, 6.0985e-01, -3.3270e-01, 4.0747e-01,\n 2.1661e+00, -2.5077e-01, 2.4493e-01, -1.4754e-01, 1.9295e-01,\n 6.3712e-02, 1.1484e-01, 2.9299e-01, -1.9083e+00, -1.6262e-01,\n -1.2202e-01, 1.3490e+00, 1.9767e+00, -7.6653e-01, 9.0663e-02,\n 5.4751e-01, 3.3240e+00, -1.3668e+00, -2.2957e-01, -1.2579e-01,\n -1.7893e+00, -6.5743e-01, -9.2011e-01, 5.1757e-01, -6.9965e-01,\n -8.0048e-01, 9.9293e-01, -2.2603e+00, 4.5113e-01, 9.7701e-01,\n 3.1929e-01, -1.9472e+00, 1.5371e+00, 2.2744e-01, -3.0459e-01,\n -1.0575e+00, -1.0350e+00, 5.7773e-01, -7.3495e-01, -5.7918e-01,\n -8.8486e-02, 8.7876e-02, 8.2660e-01, -5.7659e-01, -1.8103e-01,\n 4.4154e-01, 1.6648e-01, -1.3477e+00, -1.1308e-01, -1.7470e-01,\n -1.7867e+00, 9.2642e-01, 1.2042e+00, -1.2913e+00, -6.8172e-01,\n 2.1354e-01, 1.6011e-01, 6.4441e-01]])}]]" ], "negative": [ "[[tensor([[[-0.3854, 0.0130, -0.0590, ..., -0.4767, -0.2926, 0.0560],\n [-0.1505, -1.3392, 0.6828, ..., 0.7338, 0.4815, -0.1677],\n [ 0.0434, 0.3338, 0.3313, ..., -1.2992, -0.7545, -1.6753],\n ...,\n [-1.0512, -0.4480, -0.0275, ..., 0.3413, -1.6165, 0.5699],\n [-1.0494, -0.4428, -0.0213, ..., 0.3547, -1.6402, 0.5625],\n [-1.0197, -0.4122, 0.0050, ..., 0.3715, -1.6089, 0.5256]]]), {'pooled_output': tensor([[-1.4241e+00, -7.1973e-01, 8.9806e-01, -1.0134e+00, -3.5304e-01,\n -1.7536e-01, -5.7281e-01, 7.2589e-01, -7.5669e-01, 3.2104e-01,\n -8.4923e-01, -1.0660e+00, 1.2237e+00, -5.1887e-01, 3.7011e-01,\n 7.3019e-01, -2.0977e-02, 4.8057e-01, 2.4052e-01, -9.4954e-01,\n -5.5987e-01, 1.3013e+00, -1.7462e+00, -1.9834e-02, -1.1118e+00,\n 7.3346e-01, 4.5369e-01, -1.6587e+00, 4.1079e-01, -5.0936e-01,\n -8.2279e-01, 8.7416e-01, -3.3521e-01, -2.0504e-02, -1.3938e+00,\n -1.6438e-01, -9.6183e-01, -1.4672e+00, -1.8401e+00, 1.5441e+00,\n -9.6601e-02, 3.9181e-01, 1.6421e+00, 9.2786e-01, -7.6050e3e+00,\n 5.2377e-03, -9.4400e--1.1663e+00,\n 4.9749e-01, -1.5234e+00, -1.5715e+00, -6.5099e-01, 2.6778e-01,\n -6.8227e-01, 5.9979e+00, 4.2191e-01, 2.0054e-01, 
-2.2477e-01,\n -1.7086e+00, -1.1343e+00, 4.5211e-01, -2.8442e-02, -1.3627e+00,\n -5.8206e-01, 7.8199e-01, -6.0781e-01, -8.3585e-01, -8.6802e-01,\n 5.1829e-01, 1.2886e+00, 4.4104e-01, 2.8482e-03, -3.8052e-01,\n -3.3389e-02, 2.5350e-01, -7.9922e-01, 1.1654e+00, -6.1139e-01,\n -8.2090e-01, 1.3606e+00, 3.6909e-02, 1.8444e+00, 1.3167e+00,\n 7.0883e-01, -9.8703e-02, -4.7202e-01, 3.0436e-01, 1.9612e+00,\n 2.3247e-01, -4.0334e-01, 4.0350e-01, -1.0154e+00, 6.5527e-01,\n 7.4262e-01, 6.0564e-01, 9.2776e-01, 3.0018e-02, -7.8818e-02,\n 6.6141e-01, 4.0207e+00, -1.1783e+00, -9.4401e-01, 1.7181e-01,\n 8.7705e-01, -6.3048e-01, -9.3650e-01, 7.3225e-02, -7.9806e-01,\n 4.2167e-01, -2.5655e-01, -5.6379e-01, 2.9869e-01, 8.7093e-01,\n 9.0651e-01, -9.0517e-01, -6.0697e-01, 2.7842e-01, -3.4184e-01,\n 9.2863e-01, 5.7720e-01, -2.8530e-02, 1.6481e-01, -8.6143e-01,\n 1.1080e+00, 2.5683e+00, -1.6048e+00, -1.0073e+00, -3.1940e-01,\n -1.5001e+00, 1.1895e+00, -2.7551e-02, 1.0383e+00, -7.2717e-01,\n 1.1006e+00, 1.5268e+00, 1.5875e+00, -8.4085e-01, 6.8027e-01,\n -3.3362e-01, 5.7915e-02, -5.0491e-02]])}]]" ], "latent_image": [ "{'samples': tensor([[[[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]],\n\n [[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]],\n\n [[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]],\n\n [[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]]]])}" ] }, "current_outputs": [ "6", 
"3", "4", "5", "8", "7", "9" ], "timestamp": 1726843835007 } }
{"prediction_id": null, "error": "There was an error executing your workflow:\n\n{\n \"type\": \"execution_error\",\n \"data\": {\n \"prompt_id\": \"0515507c-4ab5-411b-9b42-6b3a4032a31a\",\n \"node_id\": \"3\",\n \"node_type\": \"KSampler\",\n \"executed\": [\n \"4\",\n \"7\",\n \"5\",\n \"6\"\n ],\n \"exception_message\": \"[Errno 32] Broken pipe\",\n \"exception_type\": \"BrokenPipeError\",\n \"traceback\": [\n \" File \\"/src/ComfyUI/execution.py\\", line 317, in execute\n output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)\n\",\n \" File \\"/src/ComfyUI/execution.py\\", line 192, in get_output_data\n return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)\n\",\n \" File \\"/src/ComfyUI/execution.py\\", line 169, in _map_node_over_list\n process_inputs(input_dict, i)\n\",\n \" File \\"/src/ComfyUI/execution.py\\", line 158, in process_inputs\n results.append(getattr(obj, func)(**inputs))\n\",\n \" File \\"/src/ComfyUI/nodes.py\\", line 1429, in sample\n return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)\n\",\n \" File \\"/src/ComfyUI/nodes.py\\", line 1396, in common_ksampler\n samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,\n\",\n \" File \\"/src/ComfyUI/comfy/sample.py\\", line 43, in sample\n samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)\n\",\n \" File \\"/src/ComfyUI/comfy/samplers.py\\", line 829, in sample\n return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)\n\",\n \" File \\"/src/ComfyUI/comfy/samplers.py\\", line 729, in sample\n return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)\n\",\n \" File \\"/src/ComfyUI/comfy/samplers.py\\", line 716, in sample\n output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)\n\",\n \" File \\"/src/ComfyUI/comfy/samplers.py\\", line 695, in inner_sample\n samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)\n\",\n \" File \\"/src/ComfyUI/comfy/samplers.py\\", line 600, in sample\n samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/torch/utils/_contextlib.py\\", line 116, in decorate_context\n return func(*args, **kwargs)\n\",\n \" File \\"/src/ComfyUI/comfy/k_diffusion/sampling.py\\", line 133, in sample_euler\n for i in trange(len(sigmas) - 1, disable=disable):\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/auto.py\\", line 37, in trange\n return tqdm(range(*args), **kwargs)\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/asyncio.py\\", line 24, in __init__\n super().__init__(iterable, *args, **kwargs)\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\\", line 1098, in __init__\n self.refresh(lock_args=self.lock_args)\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\\", line 1347, in refresh\n self.display()\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\\", line 1495, in display\n self.sp(self.__str__() if msg is None else msg)\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\\", line 459, in print_status\n fp_write('\\r' + s + (' ' * max(last_len[0] - len_s, 0)))\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\\", line 452, in fp_write\n fp.write(str(s))\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/utils.py\\", line 196, in inner\n return func(*args, **kwargs)\n\"\n ],\n \"current_inputs\": {\n \"seed\": [\n 2561272043\n ],\n \"steps\": [\n 20\n ],\n \"cfg\": [\n 8.0\n ],\n \"sampler_name\": [\n \"euler\"\n ],\n \"scheduler\": [\n \"normal\"\n ],\n \"denoise\": [\n 1.0\n ],\n \"model\": [\n \"<comfy.model_patcher.ModelPatcher object at 0x7f90c054c520>\"\n ],\n \"positive\": [\n \"[[tensor([[[-0.3854, 0.0130, -0.0590, ..., -0.4767, -0.2926, 0.0560],\n [ 0.3734, -0.7969, 1.7699, ..., -0.6371, 0.8131, 0.9324],\n [-1.0823, -0.4410, 0.4324, ..., 0.7239, -1.3086, 0.4494],\n ...,\n [-0.9714, 1.0991, -0.9264, ..., -0.5042, -0.5324, 0.1151],\n [-0.9631, 1.0848, -0.9526, ..., -0.5188, -0.5675, 0.1002],\n [-0.9225, 1.1443, -0.9150, ..., -0.5434, -0.5610, 0.0459]]]), {'pooled_output': tensor([[-1.0293e+00, 7.4614e-01, -1.3600e+00, 1.6161e+00, -4.1086e+00,\n -1.4178e+00, -3.1375e-01, -3.6280e-01, -1.2294e+00, 9.1022e-01,\n -6.1229e-02, 7.7592e-01, 7.3969e-02, 1.2458e+00, 4.8509e-01,\n 3.5944e-01, -6.6526e-01, 5.2198e-01, 1.1764e+00, -1.1787e+00,\n -1.9265e-01, -2.4779e-01, 4.0922e-01, 1.3522e-01, -5.1484e-01,\n 6.8885e-01, -, 1.6274e-01, -1.3117e+00,\n 3.1130e-01, -6.0447e-01, 4.2245e-02, -2.2521e-01, -1.8815e+00,\n 4.0495e-01, -2.0729e-01, 2.7738e-01, -6.5386e-01, -6.9539e-01,\n 6.1541e-01, -6.9837e-01, 9.1980e-02, 3.0744e-01, -2.1665e+00,\n 6.5433e-01, 7.4050e-, ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]],\n\n [[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]],\n\n [[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]],\n\n [[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]]]])}\"\n ]\n },\n \"current_outputs\": [\n \"6\",\n \"3\",\n \"4\",\n \"5\",\n \"8\",\n \"7\",\n \"9\"\n ],\n \"timestamp\": 1726843835007\n }\n}", "logger": "cog.server.runner", "timestamp": "2024-09-20T14:50:35.154865Z", "severity": "INFO", "message": "prediction failed"}
{"prediction_id": null, "logger": "cog.server.runner", "timestamp": "2024-09-20T14:54:18.814184Z", "severity": "INFO", "message": "starting prediction"}
{"prediction_id": null, "logger": "cog.server.runner", "timestamp": "2024-09-20T14:54:18.814646Z", "severity": "INFO", "message": "started prediction"}
Checking inputs
Checking weights
✅ majicmixRealistic_v7.safetensors exists in ComfyUI/models/checkpoints
Randomising seed to 3015642287
Running workflow
Executing node 3, title: KSampler, class type: KSampler
Traceback (most recent call last):
  File "/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/cog/server/worker.py", line 352, in _predict
    result = predict(**payload)
  File "/src/predict.py", line 115, in predict
    self.comfyUI.run_workflow(wf)
  File "/src/comfyui.py", line 250, in run_workflow
    self.wait_for_prompt_completion(workflow, prompt_id)
  File "/src/comfyui.py", line 192, in wait_for_prompt_completion
    raise Exception(
Exception: There was an error executing your workflow:

{
  "type": "execution_error",
  "data": {
    "prompt_id": "795d09ed-3c98-4ea7-8a18-71d802672d38",
    "node_id": "3",
    "node_type": "KSampler",
    "executed": [],
    "exception_message": "[Errno 32] Broken pipe",
    "exception_type": "BrokenPipeError",
    "traceback": [
      "  File \"/src/ComfyUI/execution.py\", line 317, in execute\n    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)\n",
      "  File \"/src/ComfyUI/execution.py\", line 192, in get_output_data\n    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)\n",
      "  File \"/src/ComfyUI/execution.py\", line 169, in _map_node_over_list\n    process_inputs(input_dict, i)\n",
      "  File \"/src/ComfyUI/execution.py\", line 158, in process_inputs\n    results.append(getattr(obj, func)(**inputs))\n",
      "  File \"/src/ComfyUI/nodes.py\", line 1429, in sample\n    return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)\n",
      "  File \"/src/ComfyUI/nodes.py\", line 1396, in common_ksampler\n    samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,\n",
      "  File \"/src/ComfyUI/comfy/sample.py\", line 43, in sample\n    samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)\n",
      "  File \"/src/ComfyUI/comfy/samplers.py\", line 829, in sample\n    return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)\n",
      "  File \"/src/ComfyUI/comfy/samplers.py\", line 729, in sample\n    return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)\n",
      "  File \"/src/ComfyUI/comfy/samplers.py\", line 716, in sample\n    output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)\n",
      "  File \"/src/ComfyUI/comfy/samplers.py\", line 695, in inner_sample\n    samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)\n",
      "  File \"/src/ComfyUI/comfy/samplers.py\", line 600, in sample\n    samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/torch/utils/_contextlib.py\", line 116, in decorate_context\n    return func(*args, **kwargs)\n",
      "  File \"/src/ComfyUI/comfy/k_diffusion/sampling.py\", line 133, in sample_euler\n    for i in trange(len(sigmas) - 1, disable=disable):\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/auto.py\", line 37, in trange\n    return tqdm(range(*args), **kwargs)\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/asyncio.py\", line 24, in __init__\n    super().__init__(iterable, *args, **kwargs)\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\", line 1098, in __init__\n    self.refresh(lock_args=self.lock_args)\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\", line 1347, in refresh\n    self.display()\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\", line 1495, in display\n    self.sp(self.__str__() if msg is None else msg)\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\", line 459, in print_status\n    fp_write('\r' + s + (' ' * max(last_len[0] - len_s, 0)))\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\", line 452, in fp_write\n    fp.write(str(s))\n",
      "  File \"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/utils.py\", line 196, in inner\n    return func(*args, **kwargs)\n"
    ],
    "current_inputs": {
      "seed": [
        3015642287
      ],
      "steps": [
        20
      ],
      "cfg": [
        8.0
      ],
      "sampler_name": [
        "euler"
      ],
      "scheduler": [
        "normal"
      ],
      "denoise": [
        1.0
      ],
      "model": [
        "<comfy.model_patcher.ModelPatcher object at 0x7f90c054c520>"
      ],
      "positive": [
        "[[tensor([[[-0.3854, 0.0130, -0.0590, ..., -0.4767, -0.2926, 0.0560],\n [ 0.3734, -0.7969, 1.7699, ..., -0.6371, 0.8131, 0.9324],\n [-1.0823, -0.4410, 0.4324, ..., 0.7239, -1.3086, 0.4494],\n ...,\n [-0.9714, 1.0991, -0.9264, ..., -0.5042, -0.5324, 0.1151],\n [-0.9631, 1.0848, -0.9526, ..., -0.5188, -0.5675, 0.1002],\n [-0.9225, 1.1443, -0.9150, ..., -0.5434, -0.5610, 0.0459]]]), {'pooled_output': tensor([[-1.0293e+00, 7.4614e-01, -1.3600e+00, 1.6161e+00, -4.1086e+00,\n -1.4178e+00, -3.1375e-01, -3.6280e-01, -1.2294e+00, 9.1022e-01,\n -6.1229e-02, 7.7592e-01, 7.3969e-02, 1.2458e+00, 4.8509e-01,\n 3.5944e-01, 5.4751e-01, 3.3240e+00, -1.3668e+00, -2.2957e-01, -1.2579e-01,\n -1.7893e+00, -6.5743e-01, -9.2011e-01, 5.1757e-01, -6.9965e-01,\n -8.0048e-01, 9.9293e-01, -2.2603e+00, 4.5113e-01, 9.7701e-01,\n 3.1929e-01, -1.9472e+00, 1.5371e+00, 2.2744e-01, -3.0459e-01,\n -1.0575e+00, -1.0350e+00, 5.7773e-01, -7.3495e-01, -5.7918e-01,\n -8.8486e-02, 8.7876e-02, 8.2660e-01, -5.7659e-01, -1.8103e-01,\n 4.4154e-01, 1.6648e-01, 
-1.3477e+00, -1.1308e-01, -1.7470e-01,\n -1.7867e+00, 9.2642e-01, 1.2042e+00, -1.2913e+00, -6.8172e-01,\n 2.1354e-01, 1.6011e-01, 6.4441e-01]])}]]" ], "negative": [ "[[tensor([[[-0.3854, 0.0130, -0.0590, ..., -0.4767, -0.2926, 0.0560],\n [-0.1505, -1.3392, 0.6828, ..., 0.7338, 0.4815, -0.1677],\n [ 0.0434, 0.3338, 0.3313, ..., -1.2992, -0.7545, -1.6753],\n ...,\n [-1.0512, -0.4480, -0.0275, ..., 0.3413, -1.6165, 0.5699],\n [-1.0494, -0.4428, -0.0213, ..., 0.3547, -1.6402, 0.5625],\n [-1.0197, -0.4122, 0.0050, ..., 0.3715, -1.6089, 0.5256]]]), {'pooled_output': tensor([[-1.4241e+00, -7.1973e-01, 8.9806e-01, -1.0134e+00, -3.5304e-01,\n -1.7536e-01, -5.7281e-01, 7.2589e-01, -7.5669e-01, 3.2104e-01,\n -8.4923e-01, -1.0660e+00, 1.2237e+00, -5.1887e-01, 3.7011e-01,\n 7.3019e-01, -2.0977e-02, 4.8057e-01, 2.4052e-01, -9.4954e-01,\n -5.5987e-01, 1.3013e+00, -1.7462e+00, -1.9834e-02, -1.1118e+00,\n 1.1006e+00, 1.5268e+00, 1.5875e+00, -8.4085e-01, 6.8027e-01,\n -3.3362e-01, 5.7915e-02, -5.0491e-02]])}]]" ], "latent_image": [ "{'samples': tensor([[[[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]],\n\n [[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]],\n\n [[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]],\n\n [[0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n ...,\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.],\n [0., 0., 0., ..., 0., 0., 0.]]]])}" ] }, "current_outputs": [ "6", "3", "4", "5", "8", "7", "9" ], "timestamp": 1726844058904 } } {"prediction_id": 
null, "error": "There was an error executing your workflow:\n\n{\n \"type\": \"execution_error\",\n \"data\": {\n \"prompt_id\": \"795d09ed-3c98-4ea7-8a18-71d802672d38\",\n \"node_id\": \"3\",\n \"node_type\": \"KSampler\",\n \"executed\": [],\n \"exception_message\": \"[Errno 32] Broken pipe\",\n \"exception_type\": \"BrokenPipeError\",\n \"traceback\": [\n \" File \\"/src/ComfyUI/execution.py\\", line 317, in execute\n output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)\n\",\n \" File \\"/src/ComfyUI/execution.py\\", line 192, in get_output_data\n return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)\n\",\n \" File \\"/src/ComfyUI/execution.py\\", line 169, in _map_node_over_list\n process_inputs(input_dict, i)\n\",\n \" File \\"/src/ComfyUI/execution.py\\", line 158, in process_inputs\n results.append(getattr(obj, func)(inputs))\n\",\n \" File \\"/src/ComfyUI/nodes.py\\", line 1429, in sample\n return common_ksampler(model, seed, steps, cfg, sampler_name, scheduler, positive, negative, latent_image, denoise=denoise)\n\",\n \" File \\"/src/ComfyUI/nodes.py\\", line 1396, in common_ksampler\n samples = comfy.sample.sample(model, noise, steps, cfg, sampler_name, scheduler, positive, negative, latent_image,\n\",\n \" File \\"/src/ComfyUI/comfy/sample.py\\", line 43, in sample\n samples = sampler.sample(noise, positive, negative, cfg=cfg, latent_image=latent_image, start_step=start_step, last_step=last_step, force_full_denoise=force_full_denoise, denoise_mask=noise_mask, sigmas=sigmas, callback=callback, disable_pbar=disable_pbar, seed=seed)\n\",\n \" File \\"/src/ComfyUI/comfy/samplers.py\\", line 829, in sample\n return sample(self.model, noise, positive, negative, cfg, self.device, sampler, sigmas, self.model_options, latent_image=latent_image, 
denoise_mask=denoise_mask, callback=callback, disable_pbar=disable_pbar, seed=seed)\n\",\n \" File \\"/src/ComfyUI/comfy/samplers.py\\", line 729, in sample\n return cfg_guider.sample(noise, latent_image, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)\n\",\n \" File \\"/src/ComfyUI/comfy/samplers.py\\", line 716, in sample\n output = self.inner_sample(noise, latent_image, device, sampler, sigmas, denoise_mask, callback, disable_pbar, seed)\n\",\n \" File \\"/src/ComfyUI/comfy/samplers.py\\", line 695, in inner_sample\n samples = sampler.sample(self, sigmas, extra_args, callback, noise, latent_image, denoise_mask, disable_pbar)\n\",\n \" File \\"/src/ComfyUI/comfy/samplers.py\\", line 600, in sample\n samples = self.sampler_function(model_k, noise, sigmas, extra_args=extra_args, callback=k_callback, disable=disable_pbar, **self.extra_options)\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/torch/utils/_contextlib.py\\", line 116, in decorate_context\n return func(*args, **kwargs)\n\",\n \" File \\"/src/ComfyUI/comfy/k_diffusion/sampling.py\\", line 133, in sample_euler\n for i in trange(len(sigmas) - 1, disable=disable):\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/auto.py\\", line 37, in trange\n return tqdm(range(*args), **kwargs)\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/asyncio.py\\", line 24, in __init__\n super().__init__(iterable, *args, **kwargs)\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\\", line 1098, in __init__\n self.refresh(lock_args=self.lock_args)\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\\", line 1347, in refresh\n self.display()\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\\", line 1495, in display\n self.sp(self.__str__() if msg is None else msg)\n\",\n \" File 
\\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\\", line 459, in print_status\n fp_write('\\r' + s + (' ' * max(last_len[0] - len_s, 0)))\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/std.py\\", line 452, in fp_write\n fp.write(str(s))\n\",\n \" File \\"/root/.pyenv/versions/3.10.15/lib/python3.10/site-packages/tqdm/utils.py\\", line 196, in inner\n return func(*args, **kwargs)\n\"\n ],\n \"current_inputs\": {\n \"seed\": [\n 3015642287\n ],\n \"steps\": [\n 20\n ],\n \"cfg\": [\n 8.0\n ],\n \"sampler_name\": [\n \"euler\"\n ],\n \"scheduler\": [\n \"normal\"\n ],\n \"denoise\": [\n 1.0\n ],\n \"model\": [\n \"<comfy.model_patcher.ModelPatcher object at 0x7f90c054c520>\"\n ],\n \"positive\": [\n \"[[tensor([[[-0.3854, 0.0130, -0.0590, ..., -0.4767, -0.2926, 0.0560],\n [ 0.3734, -0.7969, 1.7699, ..., -0.6371, 0.8131, 0.9324],\n [-1.0823, -0.4410, 0.4324, ..., 0.7239, -1.3086, 0.4494],\n ...,\n [-0.9714, 1.0991, -0.9264, ..., -0.5042, -0.5324, 0.1151],\n [-0.9631, 1.0848, -0.9526, ..., -0.5188, -0.5675, 0.1002],\n [-0.9225, 1.1443, -0.9150, ..., -0.5434, -0.5610, 0.0459]]]), {'pooled_output': tensor([[-1.0293e+00, 7.4614e-01, -1.3600e+00, 1.6161e+00, -4.1086e+00,\n -1.4178e+00, -3.1375e-01, -3.6280e-01, -1.2294e+00, 9.1022e-01,\n -6.1229e-02, 7.7592e-01, 7.3969e-02, 1.2458e+00, 4.8509e-01,\n 3.5944e-01, -6.6526e-01, 5.2198e-01, 1.1764e+00, -1.1787e+00,\n -1.9265e-01, -2.4779e-01, 4.0922e-01, 1.3522e-01, -5.1484e-01,\n 6.8885e-01, -8.8858e-01, -7.0632e-02, -9.5302e-01, -1.1487e+00,\n 1.4423e+00, -1.0700e+00, 7.3726e-03, -3.7768e-01, 2.2164e+00,\n -7.4001e-01, -4.6616e-01, -1.3759e+00, -2.4998e-01, 1.2504e+00,\n 6.6931e-01, -5.6468e-01, -8.2944e-01, -1.0866e+00, -1.2241e+00,\n -1.2160e-02, 1.2915e+00, 2.2912e-02, 2.2809e+00, -9.0426e-01,\n -3.6792e-01, -9.0332e-01, 3.2377e-01, 7.4007e-01, -1.0629e+00,\n 8.1237e-02, -1.1905e+00, -8.4613e-01, -1.7439e+00, -5.7087e-01,\n -8.4887e-01, 2.2518e-01, -1.1502e+00, 
9.0996e-02, 1.9129e-01,\n -1.5906e+00, -1.4304e+00, -2.7873e-01, 2.2951e-01, -2.3247e+00,\n -6.4006e-01, 4.6207e-01, 1.9393e-01, 8.4362e-01, -5.2884e-01,\n 5.0141e-01, -6.5805e-01, -3.0062e-01, 1.2275e+00, 5.3951e-01,\n ],\n \"timestamp\": 1726844058904\n }\n}", "logger": "cog.server.runner", "timestamp": "2024-09-20T14:54:19.049698Z", "severity": "INFO", "message": "prediction failed"}
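Root cause, per the traceback above: the `KSampler` node itself did not fail. tqdm's progress-bar refresh (`fp.write(str(s))` in `tqdm/std.py`) wrote to a stdout/stderr pipe whose reading end had gone away, and the OS reported `[Errno 32] Broken pipe`. A minimal sketch of that failure mode, independent of ComfyUI (the `b"20%|##..."` payload is a made-up stand-in for a tqdm status line):

```python
import os


def write_to_closed_pipe() -> bool:
    """Reproduce [Errno 32]: write to a pipe whose read end is closed.

    CPython ignores SIGPIPE by default, so the failed write surfaces
    as a catchable BrokenPipeError instead of killing the process.
    """
    read_fd, write_fd = os.pipe()
    os.close(read_fd)  # the consumer of our output disappears mid-run
    try:
        # Comparable to tqdm's fp.write(str(s)) in the traceback above.
        os.write(write_fd, b"20%|##        | 4/20")
        return False
    except BrokenPipeError:
        return True
    finally:
        os.close(write_fd)
```

Under that reading, the fix is to stop writing progress output to the dead stream rather than to change the sampling itself: the `disable_pbar=disable_pbar` argument visible in the `comfy/sample.py` frame is the hook ComfyUI threads through to tqdm's `disable` flag for exactly this purpose, though whether the caller here can set it depends on the cog wrapper.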