-
Thanks for the amazing work. I am now trying to run the `_train_adv_img_trans.py` code and I get an unexpected segmentation error. After locating where the error occurs, I believe it comes from the `clip_model.e…
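To narrow it down, I ran the image encoder by itself with a minimal script (a sketch that assumes the repo wraps OpenAI's `clip` package with a ViT-B/16 backbone; adjust the model name if the repo loads something else):

```python
# Minimal repro: run only the CLIP image encoder to see whether the
# segmentation error happens inside the encoder call itself.
# Assumes OpenAI's `clip` package and a ViT-B/16 backbone (an assumption;
# swap in whatever the repo actually loads).
import torch
import clip

device = "cuda" if torch.cuda.is_available() else "cpu"
model, _ = clip.load("ViT-B/16", device=device)

# Dummy batch shaped like preprocessed CLIP input: (N, 3, 224, 224).
dummy = torch.randn(4, 3, 224, 224, device=device)

with torch.no_grad():
    feats = model.encode_image(dummy)  # if it crashes here, the encoder call is the culprit
print(feats.shape)
```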
-
1. Tested on a Forge server. If a cable block is destroyed by any kind of explosion, a big propeller model gets spawned at the location of each player. This model can be clipped through and isn't used an…
-
I tried two GGUF conversions on an M2 Ultra (Metal) but no luck. I converted them myself and still got the same error.
Here is the first model I tried:
https://huggingface.co/guinmoon/MobileVLM-1.7B-GGUF…
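To separate the conversion step from the runtime, I also checked whether the converted file even opens elsewhere, via llama-cpp-python (a minimal sketch; the filename below is a placeholder for my converted file, and this only exercises the text model, not the vision side):

```python
# Sanity check: can llama-cpp-python open the converted GGUF at all?
# (pip install llama-cpp-python). The path is a placeholder for the
# converted file; only the text model is exercised here.
from llama_cpp import Llama

llm = Llama(model_path="./MobileVLM-1.7B-Q4_K.gguf", n_ctx=512)
out = llm("Hello", max_tokens=8)
print(out["choices"][0]["text"])
```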
-
I ran into an error when using demo.py:
```
(fcclip) ga@test-4U-GPU-Server:~/code/fc-clip$ python demo/demo.py --input 000741.jpg 000860.jpg --opts MODEL.WEIGHTS fcclip_cocopan.pth
[07/15 15:09:43 …
```
-
Hi,
I noticed something when loading checkpoints other than the pretrained ones and wanted to understand what the intended behavior was. For example, loading and saving a B32 pretrained checkpoint…
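To make the question concrete, here is roughly the round-trip check I ran (a minimal sketch in plain PyTorch; `b32_pretrained.pth` is a placeholder, and real checkpoints may nest the weights under a key like `state_dict`):

```python
# Save a loaded checkpoint back out and diff the two state dicts to see
# which keys or tensors change across the round trip. Paths are placeholders;
# unwrap nested dicts (e.g. ckpt["state_dict"]) if your checkpoint uses them.
import torch

original = torch.load("b32_pretrained.pth", map_location="cpu")
torch.save(original, "roundtrip.pth")
reloaded = torch.load("roundtrip.pth", map_location="cpu")

missing = set(original) - set(reloaded)
extra = set(reloaded) - set(original)
changed = [k for k in set(original) & set(reloaded)
           if torch.is_tensor(original[k]) and not torch.equal(original[k], reloaded[k])]
print("missing:", missing, "extra:", extra, "changed:", changed)
```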
-
# Prerequisites
Please answer the following questions for yourself before submitting an issue.
- [x] I am running the latest code. Development is very rapid so there are no tagged versions as o…
-
I installed ComfyUI, opened it, loaded the default workflow, loaded an XL model, and hit Start; then this warning appeared. It reduces my generation speed tenfold:
```
got prompt
model_type EPS
adm 2816
Using…
```
-
I am attempting to evaluate your model's performance on a brand-new dataset by running the `run_umt_pretraining.py` script. I have only modified the `datapath` and `batch_size` parameters to match my datase…
-
![image](https://github.com/user-attachments/assets/519d1a16-a5ac-4902-9e14-f860daaf51e1)
SD1.5 workflow bug
-
The code shows it loads the visual encoder from a CLIP model (`clip-vit-b16.pth`), but I did not find any mention of where this file comes from. I tried to load clip-vitb16 from the OpenAI Hugging Face repo, but it has …
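For reference, this is how I tried extracting just the visual tower from an OpenAI-format checkpoint (a sketch: OpenAI's original checkpoints prefix vision weights with `visual.`, while the Hugging Face transformers port uses different key names such as `vision_model.*`, so this filter only fits the OpenAI layout; stripping the prefix is my guess at what the loader expects):

```python
# Extract only the visual-encoder weights from an OpenAI-format CLIP model.
# OpenAI's checkpoints prefix the vision tower with "visual."; the Hugging
# Face transformers port names keys differently, so this filter does not
# apply to it. Stripping the prefix is an assumption about the repo's loader.
import torch
import clip

model, _ = clip.load("ViT-B/16", device="cpu")
state = model.state_dict()

visual_only = {k[len("visual."):]: v for k, v in state.items()
               if k.startswith("visual.")}
torch.save(visual_only, "clip-vit-b16-visual.pth")  # filename mirrors the one mentioned above
print(len(visual_only), "visual tensors saved")
```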