stavsap / comfyui-ollama

Apache License 2.0
372 stars 34 forks

Batch prompts: ComfyUI error report, KeyError: 'context', requesting help #58

Open lopezguan opened 2 weeks ago

lopezguan commented 2 weeks ago

ComfyUI Error Report

Error Details


## System Information
- **ComfyUI Version:** v0.2.4-16-g30c0c81
- **Arguments:** E:\ComfyUI-aki-v1.4\main.py --auto-launch --preview-method auto --disable-cuda-malloc
- **OS:** nt
- **Python Version:** 3.10.11 (tags/v3.10.11:7d4cc5a, Apr  5 2023, 00:38:17) [MSC v.1929 64 bit (AMD64)]
- **Embedded Python:** false
- **PyTorch Version:** 2.3.1+cu121
## Devices

- **Name:** cuda:0 NVIDIA GeForce RTX 3080 Ti : cudaMallocAsync
  - **Type:** cuda
  - **VRAM Total:** 12884377600
  - **VRAM Free:** 11609833472
  - **Torch VRAM Total:** 0
  - **Torch VRAM Free:** 0
![image](https://github.com/user-attachments/assets/310f65de-7559-4e0c-9a25-cdbf0b74e1a3)
stavsap commented 17 hours ago

What version of Ollama are you using? This was a bug in some versions: they omitted `context` from the response.
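To illustrate the failure mode described above: if the Ollama response JSON omits the `"context"` field, any code that indexes it directly raises `KeyError: 'context'`. The sketch below is hypothetical (the field names mirror Ollama's `/api/generate` response, but the helper functions are illustrative, not the actual comfyui-ollama code); it shows both the crash and a defensive `dict.get()` workaround.

```python
# Hypothetical sketch of the bug: some Ollama versions returned a
# response without the "context" field, so direct indexing crashes.
response_ok = {"response": "a cat on a mat", "context": [1, 2, 3]}
response_buggy = {"response": "a cat on a mat"}  # "context" omitted

def read_context_strict(response: dict) -> list:
    # Direct indexing raises KeyError when the field is missing.
    return response["context"]

def read_context_safe(response: dict) -> list:
    # dict.get() falls back to a default instead of raising,
    # at the cost of silently losing conversation state.
    return response.get("context", [])

assert read_context_strict(response_ok) == [1, 2, 3]

try:
    read_context_strict(response_buggy)
except KeyError as exc:
    print(f"KeyError: {exc}")  # KeyError: 'context'

assert read_context_safe(response_buggy) == []
```

A `dict.get()` fallback only masks the symptom; the real fix, per the comment above, is upgrading Ollama to a version that includes `context` in its response.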