bash-j / mikey_nodes

comfy nodes from mikey
MIT License

I try to implement ooba prompt but got an error #17

Open ShmuelRonen opened 10 months ago

ShmuelRonen commented 10 months ago

I start the oobabooga API with: python server.py --model TheBloke_Starling-LM-7B-alpha-GPTQ --loader ExLlamav2_HF --listen --api

Everything looks like it's working, but I got:

[screenshot "ooba" showing the error]

This is the ComfyUI log:

Prompt executed in 2.31 seconds
got prompt
Prompt executed in 2.04 seconds
got prompt
Prompt executed in 2.05 seconds
got prompt
Prompt executed in 2.06 seconds
got prompt
Prompt executed in 2.05 seconds
got prompt
Prompt executed in 2.04 seconds
got prompt
got prompt
Prompt executed in 2.04 seconds
Prompt executed in 2.08 seconds
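Recent text-generation-webui builds replaced the old custom API with an OpenAI-compatible one, so a quick way to narrow this down is to check which API the running server actually answers. A minimal sketch, assuming the default API port 5000 and the endpoint paths of the current OpenAI-compatible extension (adjust host and port to your setup):

```python
import requests

BASE = "http://127.0.0.1:5000"  # assumed default --api port

# Newer OpenAI-compatible API: GET /v1/models lists the loaded models.
try:
    r = requests.get(f"{BASE}/v1/models", timeout=5)
    print("OpenAI-compatible API:", r.status_code, r.text[:200])
except requests.RequestException as e:
    print("OpenAI-compatible API not reachable:", e)

# Legacy API that older client nodes targeted: POST /api/v1/generate.
try:
    r = requests.post(f"{BASE}/api/v1/generate",
                      json={"prompt": "Hello", "max_new_tokens": 8},
                      timeout=30)
    print("legacy API:", r.status_code, r.text[:200])
except requests.RequestException as e:
    print("legacy API not reachable:", e)
```

If only the first call succeeds, the server speaks the new API, and a node written for the old one will fail.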

bash-j commented 10 months ago

Hi ShmuelRonen, I created the oobaprompt node before they changed their name to text-generation-webui and also changed their API. I have yet to get around to making a new node for their new API, but I did make a node for LM Studio's API called LMStudioPrompt.
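For context on the change described here: the old API was a custom POST /api/v1/generate that took a raw prompt, while current builds serve an OpenAI-compatible /v1/completions instead (the same style of endpoint LM Studio exposes), so a node built against the old route gets errors from a new server. A rough sketch of the two request shapes, assuming default host and port; field names follow the respective public APIs, and exact details may vary by version:

```python
import requests

HOST = "http://127.0.0.1:5000"  # assumed default API port

# Old text-generation-webui API (what oobaprompt was written against):
old = requests.post(f"{HOST}/api/v1/generate",
                    json={"prompt": "List three colors.",
                          "max_new_tokens": 64})
# The completion used to come back as old.json()["results"][0]["text"].

# New OpenAI-compatible API (same shape LM Studio's server uses):
new = requests.post(f"{HOST}/v1/completions",
                    json={"prompt": "List three colors.",
                          "max_tokens": 64})
# The completion comes back as new.json()["choices"][0]["text"].
```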

ShmuelRonen commented 10 months ago

> Hi ShmuelRonen, I created the oobaprompt node before they changed their name to text-generation-webui and also changed their API. I have yet to get around to making a new node for their new API, but I did make a node for LM Studio's API called LMStudioPrompt.

Thanks bash-j for the quick response. I have oobabooga installed with many gigabytes of LLMs, and I would prefer to avoid installing LM Studio, mainly because all of the models would need to be downloaded again. I would appreciate it if you could update oobaprompt for the new API.

nikolaiusa commented 9 months ago

+1