Open Inferencer opened 4 days ago
Great work! A couple of questions that might save a bunch of issues from being opened:
- ETA on the code release?
- Will the released model be the 256 or 512 version?
- Inference time on GPU? (please state the tested GPU, plus the driving video's frame rate and duration)
- Lowest VRAM required? (if tested; if not, please ignore and await user reports)
+1