Open wgong opened 1 month ago
I am using an NVIDIA GeForce RTX 4060
with 8.1 GB of VRAM.
After reducing the image resolution and batch size, I am able to run inference on the example images.
Going to submit PR later
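For anyone hitting the same out-of-memory errors, here is a minimal sketch of the kind of resolution reduction described above. The `fit_resolution` helper and the rounding to multiples of 8 are my own assumptions (many vision backbones want dimensions divisible by their patch/stride size), not something taken from this repo's scripts:

```python
def fit_resolution(width, height, max_pixels):
    """Scale (width, height) down so the image stays under max_pixels,
    preserving aspect ratio. Hypothetical helper, not part of the repo."""
    pixels = width * height
    scale = 1.0 if pixels <= max_pixels else (max_pixels / pixels) ** 0.5
    # Round down to a multiple of 8 so typical backbone strides divide evenly.
    new_w = max(8, int(width * scale) // 8 * 8)
    new_h = max(8, int(height * scale) // 8 * 8)
    return new_w, new_h

# Example: cap a 4000x3000 input at roughly 1 megapixel before inference.
print(fit_resolution(4000, 3000, 1024 * 1024))
```

Combined with a batch size of 1, this was enough to fit the example images into 8 GB of dedicated VRAM on my setup.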
It's very VRAM-consuming: peak VRAM usage was about 35 GB just running the example images. I could only run it because the shared VRAM was large enough.
Are you using Windows' shared VRAM? Thank you.
Running
scripts/inference.py
throws the following error. Please update the README with the minimum required GPU VRAM.