Open salahzoubi opened 1 year ago
Hello, I'm wondering if there are ways to further optimize inference speed using your repo?
Does your repo scale with more GPUs/CPUs?
Is there a way to set the resolution to 128x128 for inference? That would reduce inference time too, right?
Much appreciated for all the help!