nixified-ai / flake

A Nix flake for many AI projects

Add ComfyUI #94

Open lboklin opened 2 months ago

lboklin commented 2 months ago

This is based on other people's work, most notably @fazo96's PR and @LoganBarnett's modifications to it.

It is currently mostly an original implementation, with a focus on making it easy to spin up a server without necessarily adding it to one's system config. It lacks a NixOS module for now, partly because @LoganBarnett is already putting a lot of work into getting one into nixpkgs.
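For illustration, the kind of ad-hoc invocation this aims to support looks roughly like the following (package names as used later in this thread; substitute the flake ref for wherever this branch lives):

    # run the ComfyUI server directly from the flake, without adding it to a system configuration
    nix run github:lboklin/nixified-ai#comfyui-nvidia   # or #comfyui-amd for ROCm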

This is a draft due to the following unsolved problems:

Airradda commented 2 months ago

I'll investigate more when I get back from work, but when trying to `nix run .#comfyui-amd` I am currently getting:

config.cudaSupport = true

```
Traceback (most recent call last):
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/comfyui", line 76, in <module>
    import execution
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/execution.py", line 11, in <module>
    import nodes
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/nodes.py", line 21, in <module>
    import comfy.diffusers_load
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/comfy/diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/comfy/sd.py", line 5, in <module>
    from comfy import model_management
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/comfy/model_management.py", line 119, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
                                  ^^^^^^^^^^^^^^^^^^
  File "/nix/store/qs1f326f9cj5giz8k0fgmqdbcwhjn8ix-comfyui-unstable-2024-04-15/comfy/model_management.py", line 88, in get_torch_device
    return torch.device(torch.cuda.current_device())
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/c8vn2rv4lv86sxp8p89vxx36pl8p0xcr-python3-3.11.9-env/lib/python3.11/site-packages/torch/cuda/__init__.py", line 787, in current_device
    _lazy_init()
  File "/nix/store/c8vn2rv4lv86sxp8p89vxx36pl8p0xcr-python3-3.11.9-env/lib/python3.11/site-packages/torch/cuda/__init__.py", line 302, in _lazy_init
    torch._C._cuda_init()
RuntimeError: Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx
```
config.cudaSupport = false

```
Traceback (most recent call last):
  File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/comfyui", line 76, in <module>
    import execution
  File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/execution.py", line 11, in <module>
    import nodes
  File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/nodes.py", line 21, in <module>
    import comfy.diffusers_load
  File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/comfy/diffusers_load.py", line 3, in <module>
    import comfy.sd
  File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/comfy/sd.py", line 5, in <module>
    from comfy import model_management
  File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/comfy/model_management.py", line 119, in <module>
    total_vram = get_total_memory(get_torch_device()) / (1024 * 1024)
                                  ^^^^^^^^^^^^^^^^^^
  File "/nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15/comfy/model_management.py", line 88, in get_torch_device
    return torch.device(torch.cuda.current_device())
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/m0lgwj7r0llsmcqqjfbswkja8n8n48hx-python3-3.11.9-env/lib/python3.11/site-packages/torch/cuda/__init__.py", line 787, in current_device
    _lazy_init()
  File "/nix/store/m0lgwj7r0llsmcqqjfbswkja8n8n48hx-python3-3.11.9-env/lib/python3.11/site-packages/torch/cuda/__init__.py", line 293, in _lazy_init
    raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
```

`nix build .#comfyui-amd` works; however, the resulting ./result/bin/comfyui fails with:

  File "/home/airradda/Git/nixified-ai/./result/bin/comfyui", line 2
    cd /nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15 && \
                                                                         ^
SyntaxError: leading zeros in decimal integer literals are not permitted; use an 0o prefix for octal integers
lboklin commented 2 months ago

> I'll investigate more when I get back from work, but when trying to nix run .#comfyui-amd I am currently getting: config.cudaSupport = true config.cudaSupport = false
>
> nix build .#comfyui-amd works; however the resulting ./result/bin/comfyui fails with:
>
>   File "/home/airradda/Git/nixified-ai/./result/bin/comfyui", line 2
>     cd /nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15 && \
>                                                                          ^
>   SyntaxError: leading zeros in decimal integer literals are not permitted; use an 0o prefix for octal integers

Does config.cudaSupport = true give you trouble when you nix run .#comfyui-amd?

Anyhow I've removed it because I figured out the reason I had to add it.

Thanks for (presumably) testing with an AMD card.

  File "/home/airradda/Git/nixified-ai/./result/bin/comfyui", line 2
    cd /nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15 && \
                                                                         ^
SyntaxError: leading zeros in decimal integer literals are not permitted; use an 0o prefix for octal integers

I don't know why that would happen; it works on my end. What is your environment? NixOS?

Airradda commented 2 months ago

> I'll investigate more when I get back from work, but when trying to nix run .#comfyui-amd I am currently getting: config.cudaSupport = true config.cudaSupport = false nix build .#comfyui-amd works; however the resulting ./result/bin/comfyui fails with:
>
>   File "/home/airradda/Git/nixified-ai/./result/bin/comfyui", line 2
>     cd /nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15 && \
>                                                                          ^
>   SyntaxError: leading zeros in decimal integer literals are not permitted; use an 0o prefix for octal integers

> Does config.cudaSupport = true give you trouble when you nix run .#comfyui-amd?

Yes, the resulting error was in the config.cudaSupport = true details.

> Thanks for (presumably) testing with an AMD card.

Yes, this is being tested on a 6950 XT.

  File "/home/airradda/Git/nixified-ai/./result/bin/comfyui", line 2
    cd /nix/store/px5qmzc34cypqpl0p28yzgr8dz7gvwx8-comfyui-unstable-2024-04-15 && \
                                                                         ^
SyntaxError: leading zeros in decimal integer literals are not permitted; use an 0o prefix for octal integers

I don't know why that would happen; it works on my end. What is your environment? NixOS?

Yes, I'm on NixOS Unstable.


Not caused by this PR or nixified-ai, but I am currently failing to build roctracer for both ROCm 5 and 6, so I can't test any further until I find out why:

roctracer

```
@nix { "action": "setPhase", "phase": "unpackPhase" }
Running phase: unpackPhase
unpacking source archive /nix/store/zvww1d6zkf2gnva6j2ccd82axj075l4s-source
source root is source
@nix { "action": "setPhase", "phase": "patchPhase" }
Running phase: patchPhase
substituteStream(): WARNING: '--replace' is deprecated, use --replace-{fail,warn,quiet}. (file 'CMakeLists.txt')
@nix { "action": "setPhase", "phase": "updateAutotoolsGnuConfigScriptsPhase" }
Running phase: updateAutotoolsGnuConfigScriptsPhase
@nix { "action": "setPhase", "phase": "configurePhase" }
Running phase: configurePhase
fixing cmake files...
cmake flags: -DCMAKE_FIND_USE_SYSTEM_PACKAGE_REGISTRY=OFF -DCMAKE_FIND_USE_PACKAGE_REGISTRY=OFF -DCMAKE_EXPORT_NO_PACKAGE_REGISTRY=ON -DCMAKE_BUILD_TYPE=Release -DBUILD_TESTING=OFF -DCMAKE_INSTALL_LOCALEDIR=/nix/store/wzz3vjk8lbpr9j0ajb0dklrl6761m6hx-roctracer-5.7.1/share/locale -DCMAKE_INSTALL_LIBEXECDIR=/nix/store/wzz3vjk8lbpr9j0ajb0dklrl6761m6hx-roctracer-5.7.1/libexec -DCMAKE_INSTALL_LIBDIR=/nix/store/wzz3vjk8lbpr9j0ajb0dklrl6761m6hx-roctracer-5.7.1/lib -DCMAKE_INSTALL_DOCDIR=/nix/store/wzz3vjk8lbpr9j0ajb0dklrl6761m6hx-roctracer-5.7.1/share/doc/roctrace>
-- The C compiler identification is GNU 12.3.0
-- The CXX compiler identification is GNU 12.3.0
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working C compiler: /nix/store/6m1vpb979mbzmiv3sqcvdjj73niz5a99-gcc-wrapper-12.3.0/bin/gcc - skipped
-- Detecting C compile features
-- Detecting C compile features - done
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Check for working CXX compiler: /nix/store/6m1vpb979mbzmiv3sqcvdjj73niz5a99-gcc-wrapper-12.3.0/bin/g++ - skipped
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD
-- Performing Test CMAKE_HAVE_LIBC_PTHREAD - Success
-- Found Threads: TRUE
CMake Deprecation Warning at /nix/store/xliqzmj3ky90pam67ppsgazw2cqbyphn-clr-5.7.1/lib/cmake/hip/hip-config.cmake:20 (cmake_minimum_required):
  Compatibility with CMake < 3.5 will be removed from a future version of CMake.
  Update the VERSION argument value or use a ... suffix to tell CMake that the project does not need compatibility with older versions.
Call Stack (most recent call first):
  CMakeLists.txt:49 (find_package)
CMake Deprecation Warning at /nix/store/xliqzmj3ky90pam67ppsgazw2cqbyphn-clr-5.7.1/lib/cmake/hip/hip-config-amd.cmake:21 (cmake_minimum_required):
  Compatibility with CMake < 3.5 will be removed from a future version of CMake.
  Update the VERSION argument value or use a ... suffix to tell CMake that the project does not need compatibility with older versions.
Call Stack (most recent call first):
  /nix/store/xliqzmj3ky90pam67ppsgazw2cqbyphn-clr-5.7.1/lib/cmake/hip/hip-config.cmake:150 (include)
  CMakeLists.txt:49 (find_package)
-- hip::amdhip64 is SHARED_LIBRARY
-- /nix/store/6m1vpb979mbzmiv3sqcvdjj73niz5a99-gcc-wrapper-12.3.0/bin/g++: CLANGRT compiler options not supported.
-- Found Python3: /nix/store/glfr70gi7hfaj50mwj2431p8bg60fhqw-python3-3.11.9/bin/python3.11 (found version "3.11.9") found components: Interpreter
CMake Error at src/CMakeLists.txt:77 (find_file):
  Could not find HIP_RUNTIME_API_H using the following files: hip_runtime_api.h
-- Configuring incomplete, errors occurred!
```
LoganBarnett commented 2 months ago

Thanks for the call out and your work on this!

I took the liberty of copying your added custom-nodes to my nixpkgs comfyui fork. While it's all still a draft, I'm working on moving all of it out of my dotfiles. I'm awaiting a reply from the original author for write permission to the branch.

lboklin commented 2 months ago

I'm experimenting with parametrising the whole flake over a configurable set of options by allowing the user to override a flake (nixified-cfg) with one of their own which holds their configuration along with a library of models.

This approach is currently the only way (as far as I can tell) to parametrise a flake (supply a flake with arguments, cf. https://github.com/NixOS/nix/issues/5663). It's a little clunky, but it's declarative, allowing one to manage one's own models and custom nodes (TBI) with a personal flake, which seems like a neat way to organise one's configurations in any case.

Edit: forgot to link the default (for now) nixified-cfg: https://github.com/lboklin/nixified-cfg. Its readme has basic instructions; in short, you clone it and pass it in with `--override-input nixified-cfg <cloned-cfg>` when running comfyui.
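For example (assuming a local clone of the cfg flake at /home/user/nixified-cfg):

    git clone https://github.com/lboklin/nixified-cfg /home/user/nixified-cfg
    nix run github:lboklin/nixified-ai#comfyui-nvidia --override-input nixified-cfg /home/user/nixified-cfg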

Airradda commented 2 months ago

I've gotten all the ROCm stuff to build, so now I'm on to the actual ComfyUI stuff. It seems to be trying to access a non-existent /var/lib/comfyui/user directory, which also appears not to be one of the directories configurable via flags (see Log 2). I'll do a more in-depth run after I get back from work.

Log

```
Total VRAM 16368 MB, total RAM 32020 MB
Set vram state to: NORMAL_VRAM
Device: cuda:0 AMD Radeon RX 6950 XT : native
VAE dtype: torch.float32
Using sub quadratic optimization for cross attention, if you have memory or speed issues try using: --use-split-cross-attention
Setting temp directory to: /var/lib/comfyui/temp/temp
Traceback (most recent call last):
  File "/nix/store/cmbxwngv3w1mqjvc5ab32yf8kj197q1c-comfyui-unstable-2024-04-15/comfyui", line 206, in <module>
    server = server.PromptServer(loop)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/cmbxwngv3w1mqjvc5ab32yf8kj197q1c-comfyui-unstable-2024-04-15/server.py", line 70, in __init__
    self.user_manager = UserManager()
                        ^^^^^^^^^^^^^
  File "/nix/store/cmbxwngv3w1mqjvc5ab32yf8kj197q1c-comfyui-unstable-2024-04-15/app/user_manager.py", line 20, in __init__
    os.mkdir(user_directory)
FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/comfyui/user'
```
Log 2 (`nix run .#comfyui-amd -- --input-directory /home/airradda/Documents/Comfy-UI/Inputs --output-directory /home/airradda/Documents/Comfy-UI/Output --temp-directory /home/airradda/Documents/Comfy-UI/Temp`)

```
Total VRAM 16368 MB, total RAM 32020 MB
Set vram state to: NORMAL_VRAM
Device: cuda:0 AMD Radeon RX 6950 XT : native
VAE dtype: torch.float32
Using sub quadratic optimization for cross attention, if you have memory or speed issues try using: --use-split-cross-attention
Setting temp directory to: /home/airradda/Documents/Comfy-UI/Temp/temp
Traceback (most recent call last):
  File "/nix/store/cmbxwngv3w1mqjvc5ab32yf8kj197q1c-comfyui-unstable-2024-04-15/comfyui", line 206, in <module>
    server = server.PromptServer(loop)
             ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/nix/store/cmbxwngv3w1mqjvc5ab32yf8kj197q1c-comfyui-unstable-2024-04-15/server.py", line 70, in __init__
    self.user_manager = UserManager()
                        ^^^^^^^^^^^^^
  File "/nix/store/cmbxwngv3w1mqjvc5ab32yf8kj197q1c-comfyui-unstable-2024-04-15/app/user_manager.py", line 20, in __init__
    os.mkdir(user_directory)
FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/comfyui/user'
```
lboklin commented 2 months ago

> I've gotten all the ROCm stuff to build, so now I'm on to the actual ComfyUI stuff. It seems to be trying to access a non-existent /var/lib/comfyui/user directory, which also appears not to be one of the directories configurable via flags (see Log 2). I'll do a more in-depth run after I get back from work.

I think the problem makes sense, because /var/lib is mutable and shouldn't be accessed at build time. I'm working on model collections, so that you build a collection of everything in the config flake and comfyui then points to that collection in its extra_model_paths.yaml.
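For reference, the mapping that ends up in that file mirrors the config-data attrset quoted further down in this thread (paths illustrative):

```nix
config-data = {
  comfyui = {
    base_path = modelsPath;                    # the built model collection
    checkpoints = "${modelsPath}/checkpoints";
    clip = "${modelsPath}/clip";
    clip_vision = "${modelsPath}/clip_vision";
    # ... this attrset is serialized to extra_model_paths.yaml and passed
    # to comfyui via --extra-model-paths-config
  };
};
```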

lboklin commented 2 months ago

Just a heads-up: I'm restructuring a bunch and things are quite broken atm.

lboklin commented 2 months ago

As of this moment, this should work (for nvidia) and give a minimal setup:

    nix run github:lboklin/nixified-ai#comfyui-nvidia --override-input nixified-cfg github:lboklin/nixified-cfg/eb69f4c62fa0ce19eee2e8a4a5d601176a398bfd

(that's the most recent commit on the comfyui-minimal branch as of now).

Airradda commented 2 months ago

I swapped the hardcoded paths in the package.nix for some user accessible ones in ~/Documents/Comfy-UI and, using your input override, I have successfully generated an image. This includes full use of the GPU during generation.

lboklin commented 2 months ago

> I swapped the hardcoded paths in the package.nix for some user-accessible ones in ~/Documents/Comfy-UI and, using your input override, I have successfully generated an image. This includes full use of the GPU during generation.

Do you mean the subdirectories of the models? I suppose those ought to be configurable as well. Hadn't thought of that!

Airradda commented 2 months ago

I tried to get a readable diff to show; what I did was hardcode config-data.comfyui.base_path and the inputPath, outputPath, tempPath, and userPath.

projects/comfyui/package.nix --- 1/4:

    , stdenv
    , symlinkJoin
    , config
    , modelsPath ? "/home/airradda/Documents/Comfy-UI/Models"
    , inputPath ? "/home/airradda/Documents/Comfy-UI/Input"
    , outputPath ? "/home/airradda/Documents/Comfy-UI/Output"
    , tempPath ? "/home/airradda/Documents/Comfy-UI/Temp"
    , userPath ? "/home/airradda/Documents/Comfy-UI/User"
    , customNodes
    , models
    }:

projects/comfyui/package.nix --- 2/4:

      config-data = {
        comfyui = {
    -     base_path = modelsPath;
    +     base_path = "/home/airradda/Documents/Comfy-UI/Models";
          checkpoints = "${modelsPath}/checkpoints";
          clip = "${modelsPath}/clip";
          clip_vision = "${modelsPath}/clip_vision";

projects/comfyui/package.nix --- 3/4:

        tqdm
      ] ++ (builtins.concatMap (node: node.dependencies) customNodes)));

      executable = writers.writeDashBin "comfyui" ''
        cd $out && \
        ${pythonEnv}/bin/python comfyui \
    -     --input-directory ${inputPath} \
    +     --input-directory "/home/airradda/Documents/Comfy-UI/Input" \
    -     --output-directory ${outputPath} \
    +     --output-directory "/home/airradda/Documents/Comfy-UI/Output" \
          --extra-model-paths-config ${modelPathsFile} \
    -     --temp-directory ${tempPath} \
    +     --temp-directory "/home/airradda/Documents/Comfy-UI/Temp" \
          "$@"
      '';

lboklin commented 2 months ago

@Airradda you can generate a diff with git diff --patch

Edit: also, looks like you are on a slightly older commit (unless you added the defaults yourself - I recently removed them)

lboklin commented 2 months ago

So the idea with the declarative model management is that you customise your set of models in the cfg flake, here. You can of course modify nixified-ai like you did and manage models yourself, mutably; but in this PR my goal is to make that unnecessary, because all of that would be handled the Nix way, and it should be easy to add what you need right in your own cfg flake.

lboklin commented 2 months ago

I hope I didn't lie by stating in the comments at the top of the file that downloads are cached even if the checksum fails to match. It seemed like they were in one case, but in another it seemed to redownload the whole thing once I added the correct checksum. That is of course less than ideal, because some models can be very large, and you really have no good way (afaik) of getting the checksum beforehand other than deliberately using an incorrect one.
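A common workaround (not specific to this PR) is the trust-on-first-use dance: give the fetcher a deliberately fake hash, build once, and copy the real hash out of the mismatch error. A minimal sketch with a hypothetical model URL:

```nix
src = pkgs.fetchurl {
  # hypothetical URL; any large model download works the same way
  url = "https://example.com/some-model.safetensors";
  # build once with this fake hash; Nix aborts with a "hash mismatch" error that prints the real sha256
  sha256 = pkgs.lib.fakeSha256;
};
```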

Airradda commented 2 months ago

> @Airradda you can generate a diff with git diff --patch
>
> Edit: also, looks like you are on a slightly older commit (unless you added the defaults yourself - I recently removed them)

That is where what I posted came from.

The commit is from last night. I explicitly added the defaults back in before hardcoding them, as they weren't applying.

Airradda commented 2 months ago

> So the idea with the declarative model management is that you customise your set of models in the cfg flake, here. You can of course modify nixified-ai like you did and manage models yourself, mutably; but in this PR my goal is to make that unnecessary, because all of that would be handled the Nix way, and it should be easy to add what you need right in your own cfg flake.

I was also trying to get a minimal setup going before messing with the cfg/nix stuff. The setup I had before, running in a rocm-pytorch container with podman, broke, so I was trying to have minimal downtime.

Tonight, I will probably start messing with and/or move to the cfg-based setup and see how that goes.

lboklin commented 2 months ago

I've progressed on implementing custom-node management in addition to models, but there is a hurdle: custom nodes can't have their own dependencies, and I don't know how to solve the problem cleanly.

Anyway, I've been trying to make a config that has everything the Krita AI plugin needs, and all requirements are met except one (controlnet_aux) due to dependencies. If one is eager, the missing dependencies can be added manually to the comfyui package.

I'm doing this on a separate branch because there is a minor change in the config "api" that I just haven't synced across both projects yet outside of these two branches (https://github.com/lboklin/nixified-ai/tree/comfyui-krita-requirements and https://github.com/lboklin/nixified-cfg/tree/comfyui-krita-requirements).

LoganBarnett commented 2 months ago

@lboklin I'm not at my computer (to provide links), but I've been doing custom nodes in my dotfiles, and I've been doing dependencies in there. These dependencies are bundled up to become ComfyUI's dependencies. Does that solve your issue?

My nixpkgs branch should also be demonstrating this.

lboklin commented 2 months ago

@LoganBarnett I'm also not at the computer atm, but I was using your nixpkgs fork as reference. If I try to do it the same way I get "... is a string with context when a set was expected". One of your comments mentioned this problem with linkFarm, so maybe an alternative to that could be concocted, but my intuition is that the problem is greater than that. I'll have a look at it again tomorrow.

lboklin commented 2 months ago

Alright, I solved the problem with custom node dependencies. For one, I had to set the derivations' passthru.dependencies, not merely dependencies (those were never preserved); but the fact that they weren't propagated was obscured by the previously mentioned error, which was caused by cfg.models and cfg.customNodes in the configuration flake being set to strings containing the outPath of the respective packages rather than to the derivations themselves.

So both declarative model management and custom nodes seem to work now. Something is still missing for the Krita plugin, so I'm looking into that now, and after that I'll redirect my attention to the NixOS module.
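For reference, the shape a custom node ends up with (mirroring the good-node example in the template quoted further down; the attribute names come from that template):

```nix
passthru.dependencies = {
  pkgs = [ ];                                   # python packages the node needs
  models = { inherit (myModels) good-model; };  # models the node pulls in with it
};
```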

lboklin commented 1 month ago

Based on feedback I've moved everything back into this main repo and added a basic package for running a krita-ai server (packages.krita-comfyui-server-"${gpuVendor}" / legacyPackages.comfyui."${gpuVendor}".kritaServer) as well as some general functions in legacyPackages.comfyui."${gpuVendor}":

Example usage of withPlugins from the command line:

    nix build --impure --expr '(builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.nvidia.withPlugins (ms: { checkpoints = { inherit (ms.checkpoints) DreamShaper_8_pruned; }; }) (ns: { inherit (ns) controlnet-aux; })'

Airradda commented 1 month ago

I have built and generated an image using

    nix build --impure --expr '(builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.amd.withPlugins (ms: { checkpoints = { inherit (ms.checkpoints) DreamShaper_8_pruned; }; }) (ns: { inherit (ns) controlnet-aux; })'

and I can confirm it is properly making full use of my AMD GPU.

lboklin commented 1 month ago

I've added a bunch of models and changed the outputs a bit.

Examples of how one could use the new outputs:

    # run a server with absolutely all models and custom nodes:
    nix run --impure --expr 'with (builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.{nvidia,amd}; withPlugins (plugins: plugins)'
    # all the checkpoint models but no custom nodes
    nix run --impure --expr 'with (builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.{nvidia,amd}; withPlugins ({ models, customNodes }: { customNodes = {}; models = { inherit (models) checkpoints; }; })'
    # run a krita ai server with all optional models included (controlnets and such):
    nix run github:lboklin/nixified-ai#krita-comfyui-server-{nvidia,amd}
    # run a minimal krita ai server with a custom model set (but please actually put the expression in a file):
    nix run --impure --expr 'with (builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.{nvidia,amd}; kritaServerWithModels (ms: with ms; { checkpoints = { inherit (checkpoints) colossus-xl-v6; }; ipadapter = { inherit (ipadapter) ip-adapter-faceid-plusv2_sdxl; }; loras = { inherit (loras) ip-adapter-faceid-plusv2_sdxl_lora; }; })'

It seems to me like it's not uncommon for custom nodes to require certain models, so I added a passthru for that as well.

reactor-node is broken because it tries to write to the models dir, but I'm leaving it there.

Airradda commented 1 month ago

This is the only functioning command for me:

    nix run --impure --expr 'with (builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.{nvidia,amd}; withPlugins ({ models, customNodes }: { customNodes = {}; models = { inherit (models) checkpoints; }; })'

The others error out because of albumentations, which produces the following error log. Aside from that, I am able to generate an image, and I will start migrating from my current nixified-cfg setup to see if I come across anything else.

Error Log

```
error: builder for '/nix/store/dc59xb4vh3x8ihxqpbnmk3d7l6y5nviv-python3.11-albumentations-1.4.2.drv' failed with exit code 139;
       last 10 log lines:
       > File "/nix/store/yqcv2gvxqjw25ypg2vaixdfl6qcsgpna-python3.11-pluggy-1.4.0/lib/python3.11/site-packages/pluggy/_hooks.py", line 501 in __call__
       > File "/nix/store/m34m9sb8z84ldimp3wzwwrz08l7w66ly-python3.11-pytest-8.0.2/lib/python3.11/site-packages/_pytest/config/__init__.py", line 175 in main
       > File "/nix/store/m34m9sb8z84ldimp3wzwwrz08l7w66ly-python3.11-pytest-8.0.2/lib/python3.11/site-packages/_pytest/config/__init__.py", line 198 in console_main
       > File "/nix/store/m34m9sb8z84ldimp3wzwwrz08l7w66ly-python3.11-pytest-8.0.2/lib/python3.11/site-packages/pytest/__main__.py", line 7 in <module>
       > File "<frozen runpy>", line 88 in _run_code
       > File "<frozen runpy>", line 198 in _run_module_as_main
       >
       > Extension modules: numpy.core._multiarray_umath, numpy.core._multiarray_tests, numpy.linalg._umath_linalg, numpy.fft._pocketfft_internal, numpy.random._common, numpy.random.bit_generator, numpy.random._bounded_integers, numpy.random._mt19937, numpy.random.mtrand, numpy.random._philox, numpy.random._pcg64, numpy.random._sfc64, numpy.random._generator, cv2, skimage._shared.geometry, yaml._yaml, scipy._lib._ccallback_c, scipy.ndimage._nd_image, scipy.special._ufuncs_cxx, scipy.special._ufuncs, scipy.special._specfun, scipy.special._comb, scipy.linalg._fblas, scipy.linalg._flapack, scipy.linalg.cython_lapack, scipy.linalg._cythonized_array_utils, scipy.linalg._solve_toeplitz, scipy.linalg._flinalg, scipy.linalg._decomp_lu_cython, scipy.linalg._matfuncs_sqrtm_triu, scipy.linalg.cython_blas, scipy.linalg._matfuncs_expm, scipy.linalg._decomp_update, scipy.sparse._sparsetools, _csparsetools, scipy.sparse._csparsetools, scipy.sparse.linalg._dsolve._superlu, scipy.sparse.linalg._eigen.arpack._arpack, scipy.sparse.csgraph._tools, scipy.sparse.csgraph._shortest_path, scipy.sparse.csgraph._traversal, scipy.sparse.csgraph._min_spanning_tree, scipy.sparse.csgraph._flow, scipy.sparse.csgraph._matching, scipy.sparse.csgraph._reordering, scipy.special._ellip_harm_2, _ni_label, scipy.ndimage._ni_label, scipy.spatial._ckdtree, scipy._lib.messagestream, scipy.spatial._qhull, scipy.spatial._voronoi, scipy.spatial._distance_wrap, scipy.spatial._hausdorff, scipy.spatial.transform._rotation, sklearn.__check_build._check_build, lz4._version, lz4.frame._frame, psutil._psutil_linux, psutil._psutil_posix, scipy.optimize._minpack2, scipy.optimize._group_columns, scipy.optimize._trlib._trlib, scipy.optimize._lbfgsb, _moduleTNC, scipy.optimize._moduleTNC, scipy.optimize._cobyla, scipy.optimize._slsqp, scipy.optimize._minpack, scipy.optimize._lsq.givens_elimination, scipy.optimize._zeros, scipy.optimize._highs.cython.src._highs_wrapper, scipy.optimize._highs._highs_wrapper, scipy.optimize._highs.cython.src._highs_constants, scipy.optimize._highs._highs_constants, scipy.linalg._interpolative, scipy.optimize._bglu_dense, scipy.optimize._lsap, scipy.optimize._direct, scipy.integrate._odepack, scipy.integrate._quadpack, scipy.integrate._vode, scipy.integrate._dop, scipy.integrate._lsoda, scipy.special.cython_special, scipy.stats._stats, scipy.stats.beta_ufunc, scipy.stats._boost.beta_ufunc, scipy.stats.binom_ufunc, scipy.stats._boost.binom_ufunc, scipy.stats.nbinom_ufunc, scipy.stats._boost.nbinom_ufunc, scipy.stats.hypergeom_ufunc, scipy.stats._boost.hypergeom_ufunc, scipy.stats.ncf_ufunc, scipy.stats._boost.ncf_ufunc, scipy.stats.ncx2_ufunc, scipy.stats._boost.ncx2_ufunc, scipy.stats.nct_ufunc, scipy.stats._boost.nct_ufunc, scipy.stats.skewnorm_ufunc, scipy.stats._boost.skewnorm_ufunc, scipy.stats.invgauss_ufunc, scipy.stats._boost.invgauss_ufunc, scipy.interpolate._fitpack, scipy.interpolate.dfitpack, scipy.interpolate._bspl, scipy.interpolate._ppoly, scipy.interpolate.interpnd, scipy.interpolate._rbfinterp_pythran, scipy.interpolate._rgi_cython, scipy.stats._biasedurn, scipy.stats._levy_stable.levyst, scipy.stats._stats_pythran, scipy._lib._uarray._uarray, scipy.stats._ansari_swilk_statistics, scipy.stats._sobol, scipy.stats._qmc_cy, scipy.stats._mvn, scipy.stats._rcont.rcont, scipy.stats._unuran.unuran_wrapper, sklearn.utils._isfinite, sklearn.utils.murmurhash, sklearn.utils._openmp_helpers, sklearn.utils.sparsefuncs_fast, sklearn.utils._random, sklearn.utils._seq_dataset, sklearn.metrics.cluster._expected_mutual_info_fast, sklearn.preprocessing._csr_polynomial_expansion, sklearn.preprocessing._target_encoder_fast, sklearn.metrics._dist_metrics, sklearn.metrics._pairwise_distances_reduction._datasets_pair, sklearn.utils._cython_blas, sklearn.metrics._pairwise_distances_reduction._base, sklearn.metrics._pairwise_distances_reduction._middle_term_computer, sklearn.utils._heap, sklearn.utils._sorting, sklearn.metrics._pairwise_distances_reduction._argkmin, sklearn.metrics._pairwise_distances_reduction._argkmin_classmode, sklearn.utils._vector_sentinel, sklearn.metrics._pairwise_distances_reduction._radius_neighbors, sklearn.metrics._pairwise_distances_reduction._radius_neighbors_classmode, sklearn.metrics._pairwise_fast, sklearn.linear_model._cd_fast, sklearn._loss._loss, sklearn.utils.arrayfuncs, sklearn.svm._liblinear, sklearn.svm._libsvm, sklearn.svm._libsvm_sparse, sklearn.utils._weight_vector, sklearn.linear_model._sgd_fast, sklearn.linear_model._sag_fast, sklearn.decomposition._online_lda_fast, sklearn.decomposition._cdnmf_fast, skimage.measure._ccomp (total: 155)
       > /nix/store/46c2xhjgxmvdxk69rpdxdkxz3c3dshdi-pytest-check-hook/nix-support/setup-hook: line 53: 401 Segmentation fault (core dumped) /nix/store/gd3shnza1i50zn8zs04fa729ribr88m9-python3-3.11.8/bin/python3.11 -m pytest -k "not test_transforms"
       > /nix/store/v5lsd029lz5lfhamivbgqyp3zdv94ah2-stdenv-linux/setup: line 1578: pop_var_context: head of shell_variables not a function context
       For full logs, run 'nix log /nix/store/dc59xb4vh3x8ihxqpbnmk3d7l6y5nviv-python3.11-albumentations-1.4.2.drv'.
error: 1 dependencies of derivation '/nix/store/yjkzz7m7rzfg33n8hak2cx7vbns373fn-python3-3.11.8-env.drv' failed to build
error: 1 dependencies of derivation '/nix/store/ing7sbwax4f25i02wpqcdsfzg4iadr25-comfyui.drv' failed to build
error: 1 dependencies of derivation '/nix/store/g4i137cgzbkdyz0a02i38avcz7b5sjyb-comfyui-unstable-2024-04-15.drv' failed to build
```
lboklin commented 1 month ago

> nix run --impure --expr 'with (builtins.getFlake "github:lboklin/nixified-ai").legacyPackages.x86_64-linux.comfyui.{nvidia,amd}; withPlugins ({ models, customNodes }: { customNodes = {}; models = { inherit (models) checkpoints; }; })' is the only functioning command for me. The others error out because of albumentations, which produces the following error log. Aside from that, I am able to generate an image and will start migrating from my current nixified-cfg setup to see if I come across anything else. Error Log

I recently added that dependency (and others) not because anything was broken but because it was listed in the relevant project's requirements.txt. You should be able to simply remove it if you just want to get past that error.

Airradda commented 1 month ago

Yeah, I just removed it from controlnet-aux and disabled reactor-node.

On a separate note, in your opinion, what would be the best way to handle extensions and/or alternative UIs?

lboklin commented 1 month ago

> Yeah, I just removed it from controlnet-aux and disabled reactor-node.
>
> On a separate note, in your opinion, what would be the best way to handle extensions and/or alternative UIs?

I suppose a similar treatment as for custom nodes and models: at the top-most level we could add an extensions and a frontend attribute to the "plugins" interface. I imagine we might want to prepare more comprehensive sets/plugins that include things from more than one category; or, just like how I implemented model dependencies for custom nodes, we could add the other categories as possible dependencies, so that adding the custom nodes from https://github.com/blib-la/blibla-comfyui-extensions would also pull in the extensions.
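A rough sketch of what that extended interface could look like (the extensions and frontend attributes are hypothetical; only models and customNodes exist today):

```nix
comfyui.withPlugins (ps: {
  models = { inherit (ps.models) checkpoints; };
  customNodes = { inherit (ps.customNodes) controlnet-aux; };
  # hypothetical additions discussed above:
  extensions = { inherit (ps.extensions) blibla-comfyui-extensions; };
  frontend = ps.frontends.default;
})
```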

Airradda commented 2 weeks ago

The current example is failing for me with this missing attribute error:

First Error

```
error:
       … while evaluating the file '/home/airradda/Documents/nixified-ai/my-comfyui.nix':

       … in the left operand of the update (//) operator
         at /home/airradda/Documents/nixified-ai/my-comfyui.nix:5:58:
            4|   vendor = "amd";
            5|   pkgs = legacyPackages.x86_64-linux.comfyui."${vendor}" // { comfyui = packages.x86_64-linux."comfyui-${vendor}"; };
             |                                                          ^
            6| in

       error: attribute 'comfyui' missing
       at /home/airradda/Documents/nixified-ai/my-comfyui.nix:5:10:
            4|   vendor = "amd";
            5|   pkgs = legacyPackages.x86_64-linux.comfyui."${vendor}" // { comfyui = packages.x86_64-linux."comfyui-${vendor}"; };
             |          ^
            6| in
```

And a `nix flake show --legacy` seems to indicate this is correct. However, removing the left operand and the `//` operator leads to a similar missing-attribute error for the right operand, even though `packages.x86_64-linux.comfyui-amd` does show up in `nix flake show --legacy`. I assumed this was an issue with my local repo, so I pulled a fresh copy and got the same error. This does not appear to occur for krita-comfyui-server, though both fail for different reasons.

nix flake show --legacy

```
git+file:///home/airradda/Documents/nixified-ai?ref=refs/heads/master&rev=c27f8ff8fbef21bd95b04b788d2e826ce03aa673
├───allSystems: unknown
├───checks
│   └───x86_64-linux
│       └───github-pages-effect-is-buildable: derivation 'effect-write-branch'
├───debug: unknown
├───formatter
│   └───x86_64-linux: package 'alejandra-3.0.0'
├───herculesCI: unknown
├───legacyPackages
│   └───x86_64-linux
├───lib: unknown
├───nixosModules
│   ├───invokeai: NixOS module
│   ├───invokeai-amd: NixOS module
│   ├───invokeai-nvidia: NixOS module
│   ├───textgen: NixOS module
│   ├───textgen-amd: NixOS module
│   └───textgen-nvidia: NixOS module
├───overlays
│   ├───python-bitsAndBytesOldGpu: Nixpkgs overlay
│   ├───python-fixPackages: Nixpkgs overlay
│   ├───python-pythonFinal: Nixpkgs overlay
│   ├───python-torchCuda: Nixpkgs overlay
│   └───python-torchRocm: Nixpkgs overlay
└───packages
    └───x86_64-linux
        ├───comfyui-amd: package 'comfyui-unstable-2024-06-12'
        ├───comfyui-nvidia: package 'comfyui-unstable-2024-06-12'
        ├───invokeai-amd: package 'python3.11-InvokeAI-3.3.0post3'
        ├───invokeai-nvidia: package 'python3.11-InvokeAI-3.3.0post3'
        ├───krita-comfyui-server-amd: package 'comfyui-unstable-2024-06-12'
        ├───krita-comfyui-server-amd-minimal: package 'comfyui-unstable-2024-06-12'
        ├───krita-comfyui-server-nvidia: package 'comfyui-unstable-2024-06-12'
        ├───krita-comfyui-server-nvidia-minimal: package 'comfyui-unstable-2024-06-12'
        ├───textgen-nvidia: package 'textgen'
        └───website: package 'nixified-ai-website-1.0.0'
```
Second Error

```
error:
       … while evaluating the file '/home/airradda/Documents/nixified-ai/my-comfyui.nix':

       error: attribute 'comfyui-amd' missing
       at /home/airradda/Documents/nixified-ai/my-comfyui.nix:5:10:
            4|   vendor = "amd";
            5|   pkgs = packages.x86_64-linux."comfyui-${vendor}";
             |          ^
            6| in
```
lboklin commented 2 weeks ago

@Airradda I suspect you forgot to change "github:nixified-ai/flake" to "github:lboklin/nixified-ai".
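In other words, the template's input needs to point at the PR branch for now:

```nix
inputs.nixified-ai.url = "github:lboklin/nixified-ai";  # not "github:nixified-ai/flake" until this is merged
```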

Anyway, I'm working on adding a proper template. I've also refactored a few things. Not sure if it's better in the end, but hopefully it turns out ok. I'll probably push it as one big commit.

lboklin commented 2 weeks ago

A simple solution to the last blocker is that the broken projects are simply left broken, with a note in the readme that they are unmaintained, explicitly instructing users to pin the last revision of the flake in which they still worked. It's beyond the scope of this PR to update and fix those projects, and the reason they are broken, after all, is that they are indeed unmaintained.

I'm waiting for a final review before squashing and un-drafting this PR.

Airradda commented 2 weeks ago

Damn, just one of those days I guess. You were correct. Anyway, here are some things from testing with a fresh template:

Issues:

* The user-defined [paths](https://github.com/lboklin/nixified-ai/blob/7850a9216df1a147f277eac40e9b3f8a3012467e/templates/comfyui/flake.nix#L74-L77) in the template can't use `basePath` without adding a `rec` to the [`comfyui.override`](https://github.com/lboklin/nixified-ai/blob/7850a9216df1a147f277eac40e9b3f8a3012467e/templates/comfyui/flake.nix#L43).
* This is probably from the hackish way I did it (see "Other Notes" 3.), but if the URL for `fetchFromUrl` contains a `?` then mkdir fails.
* After fixing the above issue, the resulting VAE fails when used with:

  Error occurred when executing VAEDecode:

  'VAE' object has no attribute 'vae_dtype'

    File "/nix/store/lk4ds5rvlskiaxvqrgm728crwn9yja4m-comfyui-unstable-2024-06-12/execution.py", line 151, in recursive_execute
      output_data, output_ui = get_output_data(obj, input_data_all)
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/nix/store/lk4ds5rvlskiaxvqrgm728crwn9yja4m-comfyui-unstable-2024-06-12/execution.py", line 81, in get_output_data
      return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/nix/store/lk4ds5rvlskiaxvqrgm728crwn9yja4m-comfyui-unstable-2024-06-12/execution.py", line 74, in map_node_over_list
      results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/nix/store/lk4ds5rvlskiaxvqrgm728crwn9yja4m-comfyui-unstable-2024-06-12/nodes.py", line 268, in decode
      return (vae.decode(samples["samples"]), )
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    File "/nix/store/lk4ds5rvlskiaxvqrgm728crwn9yja4m-comfyui-unstable-2024-06-12/comfy/sd.py", line 300, in decode
      memory_used = self.memory_used_decode(samples_in.shape, self.vae_dtype)
                                                              ^^^^^^^^^^^^^^

  This does not occur in [c27f8ff](https://github.com/nixified-ai/flake/pull/94/commits/c27f8ff8fbef21bd95b04b788d2e826ce03aa673).

Other Notes:
1. I was able to build the flake and generate with all the models without a VAE, and it is still properly using my AMD GPU.
2. I think it would be good to have an example or comments for using separate files for the models and custom nodes, as they can very quickly get quite large (mine is 1000+ lines); see the sketch after the flake.nix listing at the end of this comment.
3. And/or an example of how to use `fetchFromUrl`; I got it working in what feels like a very hackish way by doing:
```nix
myModels = with {inherit (import ./meta.nix) base-models model-types;}; let
  fetchFromUrl = import <nix/fetchurl.nix>;
in {
  # ... model attrsets whose src uses fetchFromUrl (see the flake.nix below) ...
};
```
flake.nix

```nix
{
  description = "Description for the project";

  inputs = {
    nixified-ai.url = "github:lboklin/nixified-ai";
    # no need to depend on different versions of the same inputs
    flake-parts.follows = "nixified-ai/flake-parts";
    nixpkgs.follows = "nixified-ai/nixpkgs";
  };

  outputs = inputs @ {flake-parts, ...}:
    flake-parts.lib.mkFlake {inherit inputs;} {
      systems = ["x86_64-linux"];
      perSystem = { config, self', inputs', pkgs, system, lib, ... }: let
        vendor = "amd";
        comfyui = inputs'.nixified-ai.packages."comfyui-${vendor}";
        myModels = with {inherit (import ./meta.nix) base-models model-types;}; let
          fetchFromUrl = import <nix/fetchurl.nix>;
        in {
          good-model = with inputs'.nixified-ai.legacyPackages.comfyui; {
            installPath = "checkpoints/good-model.safetensors";
            src = pkgs.writeText "fake-good-model.safetensors" "";
            # src = fetchFromHuggingFace {
            #   owner = "good-person";
            #   repo = "good-models";
            #   resource = "good/model.safetensors";
            #   # leave as `""` and build once to get actual hash
            #   sha256 = "";
            # };
            type = model-types.checkpoint;
            base = base-models.sd3-medium;
          };
          pony-diffusion-v6-xl-vae = with inputs'.nixified-ai.legacyPackages.comfyui; {
            installPath = "vae/pony-diffusion-v6-xl-vae.safetensors";
            src = fetchFromUrl {
              url = "https://civitai.com/api/download/models/290640";
              sha256 = "sha256-Z6sv2OxDmomz/tsVzGX1QzavFjx+teTyrMmPCQopsLM=";
            };
            type = model-types.vae;
            base = base-models.sdxl;
          };
          pony-diffusion-v6-xl = with inputs'.nixified-ai.legacyPackages.comfyui; {
            installPath = "checkpoints/pony-diffusion-v6-xl.safetensors";
            src = fetchFromUrl {
              url = "https://civitai.com/api/download/models/290640";
              sha256 = "sha256-Z6sv2OxDmomz/tsVzGX1QzavFjx+teTyrMmPCQopsLM=";
            };
            type = model-types.checkpoint;
            base = base-models.sdxl;
          };
          realisticVisionV51_v51VAE = with inputs'.nixified-ai.legacyPackages.comfyui; {
            installPath = "checkpoints/realisticVisionV51_v51VAE.safetensors";
            src = fetchFromHuggingFace {
              owner = "lllyasviel";
              repo = "fav_models";
              resource = "fav/realisticVisionV51_v51VAE.safetensors";
              sha256 = "sha256-FQEsU49QPOLr/CyFR7Jox1zNr/eigdtVOZlA/x1w4h0=";
            };
            type = model-types.checkpoint;
            base = base-models.sd15;
          };
          juggernautXL_version6Rundiffusion = with inputs'.nixified-ai.legacyPackages.comfyui; {
            installPath = "checkpoints/juggernautXL_version6Rundiffusion.safetensors";
            src = fetchFromHuggingFace {
              owner = "lllyasviel";
              repo = "fav_models";
              resource = "fav/juggernautXL_version6Rundiffusion.safetensors";
              sha256 = "sha256-H+bH7FTHhgQM2rx7TolyAGnZcJaSLiDQHxPndkQStH8=";
            };
            type = model-types.checkpoint;
            base = base-models.sdxl;
          };
        };
      in {
        packages.default = self'.packages.comfyui;
        packages.comfyui = comfyui.override rec {
          # good-model is a dependency of good-node, so it will be added anyway
          models = myModels;
          customNodes = {
            # random example node which we say requires good-model
            good-node = pkgs.stdenv.mkDerivation {
              pname = "good-node";
              version = "0.0.0";
              src = pkgs.fetchFromGitHub {
                owner = "Suzie1";
                repo = "ComfyUI_Guide_To_Making_Custom_Nodes";
                rev = "9d64214cfd53a89769541c645f921f0acb0c38f1";
                sha256 = "sha256-ol9Ep+K/LOaDJxWEBP+tENWCaEw8EnzlEEDtaPurmBE=";
              };
              installPhase = ''
                runHook preInstall
                mkdir -p $out/
                cp -r $src/* $out/
                runHook postInstall
              '';
              passthru.dependencies = {
                pkgs = [];
                models = {inherit (myModels) good-model;};
              };
            };
          };
          # outputPath = "/tmp/comfyui-output"; # these are the defaults
          basePath = "/home/airradda/Documents/Comfy-UI";
          inputPath = "${basePath}/input";
          outputPath = "${basePath}/output";
          tempPath = "${basePath}/temp";
          userPath = "${basePath}/user";
        };
      };
    };
}
```
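As a sketch of what note 2 could look like (file name and argument set hypothetical), the model set could live in its own file and be imported from the flake:

```nix
# models.nix (hypothetical); imported from flake.nix with something like
#   myModels = import ./models.nix { inherit pkgs; inherit (inputs'.nixified-ai) legacyPackages; };
{ pkgs, legacyPackages }:
let
  inherit (import ./meta.nix) base-models model-types;
  fetchFromUrl = import <nix/fetchurl.nix>;
in {
  # ... the same model attrsets as in the flake above ...
}
```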
lboklin commented 2 weeks ago

> Damn, just one of those days I guess.

Haha, no worries.

> Issues:
>
> * The user-defined [paths](https://github.com/lboklin/nixified-ai/blob/7850a9216df1a147f277eac40e9b3f8a3012467e/templates/comfyui/flake.nix#L74-L77) in the template can't use `basePath` without adding a `rec` to the [`comfyui.override`](https://github.com/lboklin/nixified-ai/blob/7850a9216df1a147f277eac40e9b3f8a3012467e/templates/comfyui/flake.nix#L43)

I took it out just to stop my linter from complaining about it being unnecessary when those lines are commented out, but it's a good point that one should be able to uncomment them and get no errors. Fixed now.

> * This is probably from the hackish way I did it (see "Other Notes" 3.), but if the URL for `fetchFromUrl` contains a `?` then mkdir fails

It's because the implementation of taking the basename of the url isn't clever enough to strip those characters, it would seem, so I fixed it by passing names explicitly to the fetchers.
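For illustration, the user-side equivalent for the hackish fetchFromUrl approach quoted earlier is to pass a name explicitly, so the `?` in the URL never ends up in the store path (sketch reusing the entry from the flake above):

```nix
src = fetchFromUrl {
  name = "pony-diffusion-v6-xl-vae.safetensors";  # explicit name instead of the URL's basename
  url = "https://civitai.com/api/download/models/290640";
  sha256 = "sha256-Z6sv2OxDmomz/tsVzGX1QzavFjx+teTyrMmPCQopsLM=";
};
```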

> * After fixing the above issue, the resulting VAE fails when used with:
>
>   Error occurred when executing VAEDecode:
>
>   'VAE' object has no attribute 'vae_dtype'
>
>     File "/nix/store/lk4ds5rvlskiaxvqrgm728crwn9yja4m-comfyui-unstable-2024-06-12/execution.py", line 151, in recursive_execute
>       output_data, output_ui = get_output_data(obj, input_data_all)
>                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>     File "/nix/store/lk4ds5rvlskiaxvqrgm728crwn9yja4m-comfyui-unstable-2024-06-12/execution.py", line 81, in get_output_data
>       return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
>                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>     File "/nix/store/lk4ds5rvlskiaxvqrgm728crwn9yja4m-comfyui-unstable-2024-06-12/execution.py", line 74, in map_node_over_list
>       results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
>                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>     File "/nix/store/lk4ds5rvlskiaxvqrgm728crwn9yja4m-comfyui-unstable-2024-06-12/nodes.py", line 268, in decode
>       return (vae.decode(samples["samples"]), )
>               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
>     File "/nix/store/lk4ds5rvlskiaxvqrgm728crwn9yja4m-comfyui-unstable-2024-06-12/comfy/sd.py", line 300, in decode
>       memory_used = self.memory_used_decode(samples_in.shape, self.vae_dtype)
>                                                               ^^^^^^^^^^^^^^
>
> This does not occur in c27f8ff

Hmm, I don't know what causes this. I'm using it without issue, so it must be something to do with other nodes or models, or else it is system-specific (or a flag passed to comfyui?).

> I think it would be good to have an example or comments for using separate files for models and custom nodes as they can very quickly get quite large (Mine is 1000+ lines).

Noted. I also want to add an example of using a locally downloaded model with fixed hash. I know how to do it outside of a flake but haven't tried with a flake:

    sd3-medium-incl-clips = {
      installPath = "checkpoints/stableDiffusion3SD3_sd3MediumInclClips.safetensors";
      src = pkgs.requireFile {
        url = "file:///path/to/stableDiffusion3SD3_sd3MediumInclClips.safetensors";
        sha256 = "108578x7cfwcxfys8rjiklq24m5ifvra4y7byhh04igvqldz5drv";
      };
      type = model-types.checkpoint;
      base = base-models.sd3-medium;