MiladZarour opened this issue 4 months ago
I am wondering: when we press Install Models, where is the information fetched from? I mean, are we fetching the models from www.kaggle.com, www.civitai.com, or www.huggingface.co?
Something like this:
I implemented code that shows the missing models (as a first step), but it may never work, because ComfyUI itself directly picks up some default models (if they are installed in checkpoints, etc.). I'm not sure where to find this functionality in ComfyUI and disable it for this case; I may check that in ComfyUI first and try to get the missing models without loading the default models.
Here is the code so far, if anyone would like to try it:
In comfyui-manager.js, add this code around line 725 or 726:
```javascript
$el("button.cm-button", {
    type: "button",
    textContent: "Install Missing Models",
    onclick: () => {
        if (!ModelInstaller.instance)
            ModelInstaller.instance = new ModelInstaller(app, self);
        ModelInstaller.instance.show();
    }
}),
```
In model-downloader.js, inside `class ModelInstaller extends ComfyDialog`, I added this:
```javascript
static ShowMode = {
    NORMAL: 0,
    MISSING_NODES: 1,
    UPDATE: 2,
};
```
And at the end of the class, I added this method:
```javascript
async showMissingModels() {
    try {
        this.clear();

        // Keep only the models that are not installed yet
        const allModels = (await getModelList()).models;
        const missingModels = allModels.filter(model => model.installed !== 'True');
        this.data = { models: missingModels };

        // Rebuild the dialog from scratch
        while (this.element.children.length) {
            this.element.removeChild(this.element.children[0]);
        }

        await this.createHeaderControls();
        if (this.search_keyword) {
            this.search_box.value = this.search_keyword;
        }
        await this.createGrid();
        await this.createBottomControls();

        this.apply_searchbox(this.data);
        this.element.style.display = "block";
        this.element.style.zIndex = 10001;
    }
    catch (exception) {
        app.ui.dialog.show(`Failed to get missing model list. / ${exception}`);
    }
}
```
After that, we can start developing the functionality for IP adapters.
If you would like to contribute, please make a PR.
And for information about the models, please check `model-list.json`.
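For reference, the grid above filters on each entry's `installed` flag. A rough sketch of how such a flag could be computed on the Python side from a `model-list.json`-style file (the `save_path`/`filename` field names are assumptions based on the public model-list.json format, not a guaranteed Manager API):

```python
import json
import os

def annotate_installed(model_list_path, models_root):
    """Mark each entry of a model-list.json-style file with an
    'installed' flag ('True'/'False' strings, matching the values the
    grid filter above compares against)."""
    with open(model_list_path, encoding="utf-8") as f:
        entries = json.load(f)["models"]
    for entry in entries:
        # Assumed layout: <models_root>/<save_path>/<filename>
        dest = os.path.join(models_root, entry.get("save_path", ""), entry["filename"])
        entry["installed"] = "True" if os.path.exists(dest) else "False"
    return entries
```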
Thanks @ltdrdata, I will check that. But can I make a pull request already now? I know the code is not working as it should yet; I am still figuring out how it can be done.
Of course, do it when you are ready.
Downloading missing models is a painful job.

- Some custom nodes have Python code that downloads models the first time they are loaded onto the system, which can confuse users about why ComfyUI takes so long to restart. This is especially true when a firewall blocks the downloading process.
- Some nodes simply list the models in their repo's README.md, requiring users to download them manually (for example, IPAdapter Plus).
- Others are more user-friendly, only starting to download the necessary models the first time you run their `Load` node, but some forget to honor the proxy environment variable and try to connect to unreachable sites (for example, WD14 Tagger).

On the other hand, I really like the installation process of a node: the Manager just clones the folder and runs `pip install` for the requirements. Clean and nice.

I was thinking: is it possible to come up with a file format like `required_models.json` (filename to be discussed), and encourage new nodes to provide this file in their project directory, so the Manager can download the files for the users? Because of the popularity of the Manager, I think it is possible to set this standard, and node authors will follow it since they no longer need to write downloading code.

Is it a valid idea? @ltdrdata
As a proof of concept, ChatGPT gave me a suggested format:
filename: `models.yaml`

```yaml
models:
  - name: Model1
    url: https://example.com/model1.pth
    destination: models/model1.pth
    checksum: abcdef1234567890
  - name: Model2
    url: https://example.com/model2.onnx
    destination: models/model2.onnx
    checksum: 123456abcdef7890
  - name: Model3
    url: https://example.com/model3.h5
    destination: models/model3.h5
    checksum: 7890abcdef123456
```
I think `checksum` should be optional.
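To make the idea concrete, here is a minimal sketch of how a Manager-side check might consume such a file, using the JSON variant (`required_models.json`, as proposed above) so only the standard library is needed. The function names are mine, and the checksum is treated as optional:

```python
import hashlib
import json
import os

def load_required_models(path):
    """Parse a required_models.json-style file into a list of model entries."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)["models"]

def sha256_of(path):
    """Compute the SHA-256 hex digest of a file, reading in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def missing_models(entries, base_dir="."):
    """Return the entries whose destination file is absent or fails its
    (optional) checksum -- i.e. what the Manager would need to download."""
    missing = []
    for entry in entries:
        dest = os.path.join(base_dir, entry["destination"])
        if not os.path.exists(dest):
            missing.append(entry)
        elif "checksum" in entry and sha256_of(dest) != entry["checksum"]:
            missing.append(entry)  # present but corrupt or mismatched
    return missing
```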
Now, here is an example `models.yaml` for the IPAdapter Plus node, according to its README.md (they do need a lot of models):
```yaml
models:
  - name: CLIP-ViT-H-14-laion2B-s32B-b79K
    url: https://huggingface.co/h94/IP-Adapter/resolve/main/models/image_encoder/model.safetensors
    destination: /ComfyUI/models/clip_vision/CLIP-ViT-H-14-laion2B-s32B-b79K.safetensors
  - name: CLIP-ViT-bigG-14-laion2B-39B-b160k
    url: https://huggingface.co/h94/IP-Adapter/resolve/main/sdxl_models/image_encoder/model.safetensors
    destination: /ComfyUI/models/clip_vision/CLIP-ViT-bigG-14-laion2B-39B-b160k.safetensors
  - name: ip-adapter_sd15
    url: https://huggingface.co/h94/IP-Adapter/resolve/main/models/ip-adapter_sd15.safetensors
    destination: /ComfyUI/models/ipadapter/ip-adapter_sd15.safetensors
  - name: ip-adapter_sd15_light_v11
    url: https://huggingface.co/h94/IP-Adapter/resolve/main/models/ip-adapter_sd15_light_v11.bin
    destination: /ComfyUI/models/ipadapter/ip-adapter_sd15_light_v11.bin
  - name: ip-adapter-plus_sd15
    url: https://huggingface.co/h94/IP-Adapter/resolve/main/models/ip-adapter-plus_sd15.safetensors
    destination: /ComfyUI/models/ipadapter/ip-adapter-plus_sd15.safetensors
  - name: ip-adapter-plus-face_sd15
    url: https://huggingface.co/h94/IP-Adapter/resolve/main/models/ip-adapter-plus-face_sd15.safetensors
    destination: /ComfyUI/models/ipadapter/ip-adapter-plus-face_sd15.safetensors
  - name: ip-adapter-full-face_sd15
    url: https://huggingface.co/h94/IP-Adapter/resolve/main/models/ip-adapter-full-face_sd15.safetensors
    destination: /ComfyUI/models/ipadapter/ip-adapter-full-face_sd15.safetensors
  - name: ip-adapter_sd15_vit-G
    url: https://huggingface.co/h94/IP-Adapter/resolve/main/models/ip-adapter_sd15_vit-G.safetensors
    destination: /ComfyUI/models/ipadapter/ip-adapter_sd15_vit-G.safetensors
  - name: ip-adapter_sdxl_vit-h
    url: https://huggingface.co/h94/IP-Adapter/resolve/main/sdxl_models/ip-adapter_sdxl_vit-h.safetensors
    destination: /ComfyUI/models/ipadapter/ip-adapter_sdxl_vit-h.safetensors
  - name: ip-adapter-plus_sdxl_vit-h
    url: https://huggingface.co/h94/IP-Adapter/resolve/main/sdxl_models/ip-adapter-plus_sdxl_vit-h.safetensors
    destination: /ComfyUI/models/ipadapter/ip-adapter-plus_sdxl_vit-h.safetensors
  - name: ip-adapter-plus-face_sdxl_vit-h
    url: https://huggingface.co/h94/IP-Adapter/resolve/main/sdxl_models/ip-adapter-plus-face_sdxl_vit-h.safetensors
    destination: /ComfyUI/models/ipadapter/ip-adapter-plus-face_sdxl_vit-h.safetensors
  - name: ip-adapter_sdxl
    url: https://huggingface.co/h94/IP-Adapter/resolve/main/sdxl_models/ip-adapter_sdxl.safetensors
    destination: /ComfyUI/models/ipadapter/ip-adapter_sdxl.safetensors
  - name: ip-adapter-faceid_sd15
    url: https://huggingface.co/h94/IP-Adapter-FaceID/resolve/main/ip-adapter-faceid_sd15.bin
    destination: /ComfyUI/models/ipadapter/ip-adapter-faceid_sd15.bin
  - name: ip-adapter-faceid-plusv2_sd15
    url: https://huggingface.co/h94/IP-Adapter-FaceID/resolve/main/ip-adapter-faceid-plusv2_sd15.bin
    destination: /ComfyUI/models/ipadapter/ip-adapter-faceid-plusv2_sd15.bin
  - name: ip-adapter-faceid-portrait-v11_sd15
    url: https://huggingface.co/h94/IP-Adapter-FaceID/resolve/main/ip-adapter-faceid-portrait-v11_sd15.bin
    destination: /ComfyUI/models/ipadapter/ip-adapter-faceid-portrait-v11_sd15.bin
  - name: ip-adapter-faceid_sdxl
    url: https://huggingface.co/h94/IP-Adapter-FaceID/resolve/main/ip-adapter-faceid_sdxl.bin
    destination: /ComfyUI/models/ipadapter/ip-adapter-faceid_sdxl.bin
  - name: ip-adapter-faceid-plusv2_sdxl
    url: https://huggingface.co/h94/IP-Adapter-FaceID/resolve/main/ip-adapter-faceid-plusv2_sdxl.bin
    destination: /ComfyUI/models/ipadapter/ip-adapter-faceid-plusv2_sdxl.bin
  - name: ip-adapter-faceid-portrait_sdxl
    url: https://huggingface.co/h94/IP-Adapter-FaceID/resolve/main/ip-adapter-faceid-portrait_sdxl.bin
    destination: /ComfyUI/models/ipadapter/ip-adapter-faceid-portrait_sdxl.bin
  - name: ip-adapter-faceid-portrait_sdxl_unnorm
    url: https://huggingface.co/h94/IP-Adapter-FaceID/resolve/main/ip-adapter-faceid-portrait_sdxl_unnorm.bin
    destination: /ComfyUI/models/ipadapter/ip-adapter-faceid-portrait_sdxl_unnorm.bin
  - name: ip-adapter-faceid_sd15_lora
    url: https://huggingface.co/h94/IP-Adapter-FaceID/resolve/main/ip-adapter-faceid_sd15_lora.safetensors
    destination: /ComfyUI/models/loras/ip-adapter-faceid_sd15_lora.safetensors
  - name: ip-adapter-faceid-plusv2_sd15_lora
    url: https://huggingface.co/h94/IP-Adapter-FaceID/resolve/main/ip-adapter-faceid-plusv2_sd15_lora.safetensors
    destination: /ComfyUI/models/loras/ip-adapter-faceid-plusv2_sd15_lora.safetensors
  - name: ip-adapter-faceid_sdxl_lora
    url: https://huggingface.co/h94/IP-Adapter-FaceID/resolve/main/ip-adapter-faceid_sdxl_lora.safetensors
    destination: /ComfyUI/models/loras/ip-adapter-faceid_sdxl_lora.safetensors
  - name: ip-adapter-faceid-plusv2_sdxl_lora
    url: https://huggingface.co/h94/IP-Adapter-FaceID/resolve/main/ip-adapter-faceid-plusv2_sdxl_lora.safetensors
    destination: /ComfyUI/models/loras/ip-adapter-faceid-plusv2_sdxl_lora.safetensors
  - name: ip_plus_composition_sd15
    url: https://huggingface.co/ostris/ip-composition-adapter/resolve/main/ip_plus_composition_sd15.safetensors
    destination: /ComfyUI/models/ipadapter/ip_plus_composition_sd15.safetensors
  - name: ip_plus_composition_sdxl
    url: https://huggingface.co/ostris/ip-composition-adapter/resolve/main/ip_plus_composition_sdxl.safetensors
    destination: /ComfyUI/models/ipadapter/ip_plus_composition_sdxl.safetensors
```
The upcoming Comfy Leadership Council LA summit will include discussions on node standardization. I hope the proposal to provide specifications for the models being used will be well reflected.
If a model list is explicitly provided, the Manager and comfy-cli will support the downloading of not only nodes but also the dependent models.
Please keep me informed about the outcome of the summit. Thanks!
Yes, I was thinking about a JSON file with the necessary modules and IP adapters, as you mentioned, @liusida. That would be a perfect solution.
I was considering implementing it as follows:
- The user loads a .json workflow.
- If something is missing (nodes, modules, IP adapters):
  - A JSON file is created listing the missing nodes, modules, and IP adapters (possibly including a URL from Hugging Face).
  - A pop-up message informs the user of the nodes, modules, and IP adapters that need to be installed, with options to proceed (Yes/No).
  - If the user selects Yes, the Manager attempts to download the necessary components to their correct folders.
  - If the user selects No, no action is taken.

But I'm still not sure how to stop the Manager from loading the default values for the modules, IP adapters, etc. This solution could also be implemented for all the other folders (LoRAs, CLIPs, etc.).
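The detection step in the flow above could be sketched roughly like this. It is a toy illustration: the assumption that model references are string widget values with known weight-file extensions is mine, and a real implementation would also have to map each file to its correct subfolder:

```python
import os

# Extensions that typically indicate a model weight in a workflow (assumption)
MODEL_EXTENSIONS = (".safetensors", ".ckpt", ".bin", ".pth", ".onnx")

def referenced_models(workflow):
    """Collect widget values in a ComfyUI workflow dict (the UI-format
    workflow.json, with a top-level 'nodes' list) that look like model
    filenames by extension."""
    names = set()
    for node in workflow.get("nodes", []):
        for value in node.get("widgets_values", []):
            if isinstance(value, str) and value.lower().endswith(MODEL_EXTENSIONS):
                names.add(value)
    return names

def missing_from_disk(workflow, models_dir):
    """Return referenced model filenames not found anywhere under
    models_dir -- the list a pop-up would present to the user."""
    installed = set()
    for _root, _dirs, files in os.walk(models_dir):
        installed.update(files)
    return sorted(n for n in referenced_models(workflow)
                  if os.path.basename(n) not in installed)
```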
Embedding models' URLs in workflow.json might be a valid idea!
Interestingly, some cool guy has already embedded models' URLs in their modified version of workflow.json. It might be a good starting point:
```json
"files": [
    [
        {
            "download_url": "https://comfyworkflows.com/api/comfyui-launcher/files/h/44131596/download",
            "dest_relative_path": "models/ipadapter/ip-adapter-faceid_sdxl.bin",
            "sha256_checksum": "f455fed24e207c878ec1e0466b34a969d37bab857c5faa4e8d259a0b4ff63d7e",
            "size": 1071149741
        }
    ],
    ...
```
And https://comfyworkflows.com/api/comfyui-launcher/files/h/44131596/download specifies different possible places to download the file from, just in case one repo deletes its files:
```json
{"urls": [
    "https://huggingface.co/AlexCh4532/models_for_ControlNet/resolve/eb2e21c760555eb93b577c158ba3d81814951d77/ip-adapter-faceid_sdxl.bin",
    "https://huggingface.co/phamhungd/XLControlnet/resolve/74ecd902640ebb0f6c3e621389d477d39617db3b/ip-adapter/ip-adapter-faceid_sdxl.bin",
    "https://huggingface.co/JCTN/IP-Adapter-FaceID/resolve/1263ddceb4b923670ba8416349aa7b1f0e2ba476/ip-adapter-faceid_sdxl.bin",
    "https://huggingface.co/phamhungd/XLControlnet/resolve/663185e727e22d694a8d58ce574bcc3fec120383/ip-adapter-faceid_sdxl.bin",
    "https://huggingface.co/BarrenWardo/IPAdapters-FaceID/resolve/4f1a205f9dfc5fa1d5131945bce81ba9c5c551d6/ip-adapter-faceid_sdxl.bin",
    "https://huggingface.co/h94/IP-Adapter-FaceID/resolve/43907e6f44d079bf1a9102d9a6e56aef7a219bae/ip-adapter-faceid_sdxl.bin"
]}
```
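A fallback loop over such a mirror list could look like the sketch below. This is an illustration, not the actual downloader of any of these tools: the fetch step is injectable so the retry logic is separate from the HTTP call, and the default fetcher uses urllib, which honors the standard `http_proxy`/`https_proxy` environment variables:

```python
import urllib.request

def fetch_url(url, dest):
    """Default fetcher: stream the URL to dest in 1 MiB chunks."""
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        while True:
            chunk = resp.read(1 << 20)
            if not chunk:
                break
            out.write(chunk)

def download_with_fallback(urls, dest, fetch=fetch_url):
    """Try each mirror URL in order; return the URL that succeeded.
    Raises RuntimeError only if every mirror fails."""
    errors = []
    for url in urls:
        try:
            fetch(url, dest)
            return url
        except Exception as exc:  # one repo deleting its file should not stop us
            errors.append(f"{url}: {exc}")
    raise RuntimeError("all mirrors failed:\n" + "\n".join(errors))
```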
It would be nicer to use a GitHub link instead of hosting the URL list on their own server, comfyworkflows.com, just like what we did with custom-node-list.json here.
To this end, I was wondering: why not simply build a Docker image for each workflow? At least for some popular workflows.
@ltdrdata Were there any decisions from the summit? Anything about the model list?
I am thinking about starting work on this issue tomorrow.
Due to time constraints, I was unable to attend offline, so I couldn't see the discussion results directly. I plan to receive the content from the collaborating attendees soon.
For your information. https://www.youtube.com/watch?v=4__I4tc49do&list=PL-qR9Oxm8A3cdzjK_zv3Kv9V2Uo6vCvuu
Thank you for sharing this! Really appreciate it! Some of the proposals resonate strongly with me. I'll be quite glad to see the template for new custom nodes. And I heard Comfy mention that maybe civit.ai or some other APIs could provide a similarity check, so workflow users don't need to download the exact models every time, which is very imaginative.
I would love to contribute to this thread. I found this repo and saw that it can download the models, provided they are in the existing list and the added list. This can help with the missing models and point a way forward for IP adapters. I have made some changes in my repo to run this script before running ComfyUI.
When using ComfyUI and running run_with_gpu.bat, importing a JSON file may result in missing nodes. This issue can be easily fixed by opening the manager and clicking on "Install Missing Nodes," allowing us to check and install the required nodes.
However, this functionality does not extend to missing models or IP-adapters.
To address this, I suggest implementing a feature that allows the installation of missing models and other components directly from the manager tab when importing a JSON file. This would streamline the process and ensure all necessary components are installed seamlessly.
I would like to contribute and try fixing this issue.