black-forest-labs / flux

Official inference repo for FLUX.1 models
Apache License 2.0

List supported GPUs on the readme? #67

Open · HubKing opened this issue 1 month ago

HubKing commented 1 month ago

I think you need to list which GPUs are supported.

duracell80 commented 1 month ago

Currently having success generating 1920x1080 desktop wallpapers with schnell via diffusers, after prompt enhancement with ollama-llama3, while barely using 1 GB of VRAM.

Sequential CPU offload; average speed around 140 seconds at 5 steps. Max GPU draw 35 W.

GPU: NVIDIA GeForce RTX 3050 6 GB (2304 CUDA cores, 72 TMUs, 32 ROPs). CPU: i9-13000H, 40 GB RAM.
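
For reference, here's roughly what that setup looks like with diffusers. This is a minimal sketch, not an official recipe: the model id is the public FLUX.1-schnell checkpoint, the prompt is a placeholder standing in for the ollama-llama3 enhancement step, and the resolution and step count are just the numbers quoted above.

```python
import torch
from diffusers import FluxPipeline

# Load FLUX.1-schnell in bfloat16 (requires a diffusers release that ships FluxPipeline).
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",
    torch_dtype=torch.bfloat16,
)

# Sequential CPU offload streams weights to the GPU piece by piece,
# trading speed for a very small VRAM footprint.
pipe.enable_sequential_cpu_offload()

# Placeholder prompt; in the setup above this would come from ollama-llama3.
prompt = "a misty alpine valley at sunrise, ultra detailed, wide angle"

# Note: FLUX expects dimensions divisible by 16, so 1080 may be warned about
# or adjusted internally depending on the diffusers version.
image = pipe(
    prompt,
    width=1920,
    height=1080,
    num_inference_steps=5,   # schnell is tuned for very few steps
    guidance_scale=0.0,      # schnell is distilled and does not use CFG
    generator=torch.Generator("cpu").manual_seed(0),
).images[0]
image.save("wallpaper.png")
```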

HubKing commented 1 month ago

> Currently having success generating 1920x1080 desktop wallpapers with schnell via diffusers, after prompt enhancement with ollama-llama3, while barely using 1 GB of VRAM.
>
> Sequential CPU offload; average speed around 140 seconds at 5 steps. Max GPU draw 35 W.
>
> GPU: NVIDIA GeForce RTX 3050 6 GB (2304 CUDA cores, 72 TMUs, 32 ROPs). CPU: i9-13000H, 40 GB RAM.

Well, you're using an Nvidia RTX card, which basically guarantees that any AI workload will run. But if you were on a Radeon or Arc card, you'd hesitate to even try, because more often than not it wouldn't work (or would silently fall back to the CPU) after you'd already downloaded tens of gigabytes of models. So official requirements in the readme could save those people a lot of time.

SwanVods commented 15 hours ago

I'm using an RX 6800 16 GB and I don't see it accessing my GPU. The whole process pushes my 32 GB of RAM to 100%. Is there any config related to this?
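
Not a maintainer, but one common cause on AMD cards is a PyTorch build without ROCm support, in which case everything silently runs on the CPU and fills system RAM. A quick sanity check, assuming a PyTorch/diffusers based setup (on a ROCm build the RX 6800 shows up through the regular `torch.cuda` API):

```python
import torch

# On a ROCm build of PyTorch an AMD GPU is driven through the torch.cuda API (HIP underneath).
print(torch.__version__)                  # should contain "rocm", not "+cpu"
print(torch.cuda.is_available())          # must be True, otherwise inference falls back to CPU
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # e.g. "AMD Radeon RX 6800"

# If the check passes, put the pipeline on the GPU explicitly, or use offloading
# if 16 GB of VRAM turns out to be too tight for the full model:
#   pipe.to("cuda")                        # fastest, highest VRAM use
#   pipe.enable_model_cpu_offload()        # keeps only the active sub-model on the GPU
#   pipe.enable_sequential_cpu_offload()   # smallest VRAM footprint, slowest
```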