Vaibhavs10 / insanely-fast-whisper

Apache License 2.0

torch_dtype only for torch.float16? #178

Open yumianhuli1 opened 8 months ago

yumianhuli1 commented 8 months ago

Does inference currently support only torch_dtype=torch.float16? Will int8_float16 and int8 be supported?
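For context on the distinction the question touches: `torch_dtype` selects a floating-point compute dtype (e.g. `torch.float16`, `torch.float32`), whereas `int8` and `int8_float16` are weight-quantization schemes that, in the Transformers stack this project builds on, go through a separate loading option rather than `torch_dtype`. A minimal sketch of that distinction, with a hypothetical helper name and illustrative argument mapping (not this project's actual API):

```python
# Hypothetical helper illustrating the difference between a compute
# dtype (passed via torch_dtype) and int8 weight quantization (a
# separate loading path, e.g. bitsandbytes in Transformers).
# All names here are illustrative, not insanely-fast-whisper's API.

def resolve_precision(requested: str) -> dict:
    """Map a user-requested precision string to loader arguments."""
    float_dtypes = {"float32", "float16", "bfloat16"}
    if requested in float_dtypes:
        # These map directly to torch_dtype=torch.<name>.
        return {"torch_dtype": requested}
    if requested in {"int8", "int8_float16"}:
        # int8 is not a torch_dtype value; it would be requested through
        # a quantization option instead, with fp16 still used for the
        # non-quantized parts in the mixed int8_float16 scheme.
        return {"torch_dtype": "float16", "load_in_8bit": True}
    raise ValueError(f"unsupported precision: {requested}")

print(resolve_precision("float16"))       # {'torch_dtype': 'float16'}
print(resolve_precision("int8_float16"))  # includes load_in_8bit=True
```

Whether the CLI ever exposes such a flag is up to the maintainers; the sketch only shows why `int8` cannot simply be another `torch_dtype` value.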