Kinsmir opened this issue 1 year ago (status: Open)
The API doesn't look very reliable with the magic numbers here and there. I'm sure this can be revisited once they provide a proper API that is guaranteed not to break every now and then.
I agree, it's not optimal in its current form. I'm not sure how often the current API will change or break; either way, only time will tell.
The magic numbers are a side effect of the way this UI works with your custom workflows, and I do not see this changing anytime soon. But one option could be to provide some fixed/hardcoded workflows initially, where the magic settings/parameters are set in stone and not user-changeable (without editing the code).
For example, let's take these 2 initially;
Maybe I'm just thinking aloud here, because I'm realizing this as I type this reply: since this is loading a workflow/JSON file, can't we just set some tokens inside the input values, then search-and-replace them in the JSON file before sending it off to the API endpoint? For example;
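A minimal sketch of that search-and-replace idea in Python. The %prompt%/%width% token names and the node layout below are just illustrative assumptions, not anything ComfyUI prescribes:

```python
import json

def fill_workflow(template_text: str, values: dict) -> dict:
    """Search-and-replace %name% tokens in raw workflow JSON text,
    then parse the result into a dict ready to send to the API."""
    for name, value in values.items():
        template_text = template_text.replace(f"%{name}%", str(value))
    return json.loads(template_text)

# Made-up fragment of an exported workflow with tokens inserted by hand:
template = '{"6": {"inputs": {"text": "%prompt%", "width": %width%}}}'
workflow = fill_workflow(template, {"prompt": "a red fox", "width": 512})
```

Doing the replacement on the raw text (rather than walking the parsed dict) keeps the exported workflow otherwise untouched, which is the point of the token idea.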
Will probably have ai-template support before automatic1111, and it can also use SDXL + Refiner together properly. The former would lead to a big speedup for generation.
Comfy is much more memory efficient than A1111. To the point that I can run SDXL with the refiner without any issues in Comfy, but absolutely can't use it in A1111, even if I am only trying to use the base. With a 10GB VRAM GPU.
It's also a huge bonus when it comes to running SD and LLM simultaneously. The more VRAM you have, the bigger GPTQ model you can get, or the more GPU layers can be used with GGML.
True, my measly 6GB card can even do 2048x2048 SDXL in Comfy, meanwhile A1111 can't even load the model.
I took another look at the "API" Comfy provides, and for now I can't see an easy way to implement it, besides maybe hardcoding the whole pipeline JSON with the ability to type in the sampler and model names. For now it's a "won't fix" until a proper HTTP API is provided, at least on par with something like Stable Horde.
I copy-pasted the Stable Diffusion extension and made some quick-and-dirty edits to make it use ComfyUI with (I think?) pretty much the full feature set.
https://github.com/LenAnderson/SillyTavern-ComfyUI
You have to export the workflow you want to use from ComfyUI and edit the JSON file to insert placeholders (e.g. "%prompt%", "%width%", etc.), and then paste the JSON text into the extension settings in SillyTavern.
And you have to launch ComfyUI with the "--enable-cors-header http://-YOUR-SILLY-TAVERN-ENDPOINT" command line argument. I'm using ComfyUI with the Windows standalone build, so I edited the run_nvidia.bat file accordingly.
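On the Windows standalone build, that edit might look something like the following. The paths and the SillyTavern origin here are assumptions for a default install, so adjust them to your own setup:

```bat
REM run_nvidia.bat — sketch, assuming the default standalone layout and
REM SillyTavern served at http://127.0.0.1:8000
.\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build --enable-cors-header http://127.0.0.1:8000
```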
Expect bugs! I just wanted to see if this would work and apparently it does.
LenAnderson I tried your thingie, it did not seem to work for some reason. I'm looking at the calls to sd_remote in ST extras, and I see not too many of them are used; I think it might be possible to write a tiny shim server that would pretend to be A1111 for ST and just translate stuff to Comfy. Not sure if I have enough knowledge and will to do it. But I might :)
The calls to comfy are made directly from JS on the client side (i.e. from your browser). Check if you have any exceptions in the dev console (press F12). If you get any messages about CORS, you did not launch comfy correctly. Also check comfy's terminal output to see if the requests get there and if it complains about anything.
Also, the extension uses its own slash commands (/comfy instead of /sd) and its own entry in the menu next to the chat box.
I wasn't even able to get to that; in the ST extensions interface it would show
Hmm. How did you install? It should work by simply using the import from github button in ST. It's an extension for ST, not for ST extras, so main ST host:port should be fine.
Does the console show the full path of where it tried to look for settings.html?
You can also check your filesystem: go to your SillyTavern directory, then public/scripts/extensions/third-party/. There should be a directory SillyTavern-ComfyUI with index.js, manifest.json, README.md, settings.html, and style.css.
@LenAnderson at this point it would be easier for you to add Comfy as an official provider for the ST SD plugin. I can bear the pipeline being packed into a textbox, as I see this particular SD backend has its share of users.
@Cohee1207 Yes, that was the idea, and it's how I actually started this test before making a copy to avoid conflicts with ST updates. However, I wouldn't want to make a pull request with the minimal effort I had put into this. And since I don't know when I'll get around to doing this properly, I thought I would at least share the results in case anyone else is interested...
@LenAnderson that's how I did it, I clicked on "cloud with down arrow". Not exactly sure why it did not work though.
I personally made the switch recently from the automatic1111 webui to ComfyUI, due to better overall performance on my setup and more flexibility with the workflows it provides for generating with SDXL, LoRAs, and others.
While the API of ComfyUI isn't very well known or well documented yet, they offer two methods for API endpoints where you can load a workflow, including prompts, through either a direct API call or a WebSocket connection. See the scripting examples here: https://github.com/comfyanonymous/ComfyUI/tree/master/script_examples
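For the direct-call method, a hedged sketch of queueing a workflow over HTTP might look like this. It assumes a local ComfyUI instance on its default port (8188) and a workflow dict exported from the UI in API format; the helper names are mine, not ComfyUI's:

```python
import json
import urllib.request
import uuid

def build_prompt_payload(workflow: dict, client_id: str) -> bytes:
    """Wrap an API-format workflow the way the /prompt endpoint expects it."""
    return json.dumps({"prompt": workflow, "client_id": client_id}).encode("utf-8")

def queue_prompt(workflow: dict, server: str = "127.0.0.1:8188") -> dict:
    # The client_id lets a WebSocket listener on /ws match progress
    # messages and results back to this request.
    payload = build_prompt_payload(workflow, str(uuid.uuid4()))
    req = urllib.request.Request(f"http://{server}/prompt", data=payload)
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

The WebSocket method works the same way on the submit side; you additionally keep a socket open to receive execution updates for your client_id.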
It would be nice to have support for this feature in the future. I am aware this is probably far from the largest user base yet, but it would be nice to consider it as an option.