Closed: ali7569392725 closed this issue 3 weeks ago
Can you try this demo and see if it generates output correctly?
The demo works fine, but when I run the inference script the output is not generated properly. The command is below; I modified my input and output paths in the config.
"python inference_onnx.py --onnx_path lightning_logs/version_0/checkpoints/frn.onnx"
Hi, sorry for the late reply.
The ONNX model in the repo is probably different from the one in the HF Space. Can you try using the one in the HF Space? https://huggingface.co/spaces/anhnv125/FRN/blob/main/lightning_logs/version_0/checkpoints/frn.onnx
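If it helps, one way to pull that exact file from the Space programmatically (a sketch, assuming huggingface_hub is installed; the repo_id and filename are taken from the URL above):

```python
# Download the checkpoint from the HF Space and print its local cache path.
from huggingface_hub import hf_hub_download

onnx_path = hf_hub_download(
    repo_id="anhnv125/FRN",
    repo_type="space",  # the file lives in a Space, not a model repo
    filename="lightning_logs/version_0/checkpoints/frn.onnx",
)
print(onnx_path)  # pass this path to inference_onnx.py via --onnx_path
```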
I tried running inference with the existing model on a 48 kHz lossy sample audio file. The output length is expected to match the input length, but the generated output was only 1 KB.
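A 1 KB WAV file is essentially just the header plus a handful of samples, so comparing the input and output files directly shows whether any audio was actually written. Here is a minimal sketch, assuming soundfile is installed; the two paths below are placeholders for the input and output paths set in the config:

```python
# Compare input/output length and sample rate to confirm how much audio was written.
import soundfile as sf

inp = sf.info("path/to/input_lossy.wav")  # placeholder for the input path in the config
out = sf.info("path/to/output.wav")       # placeholder for the output path in the config

print(f"input : {inp.frames} frames @ {inp.samplerate} Hz ({inp.duration:.2f} s)")
print(f"output: {out.frames} frames @ {out.samplerate} Hz ({out.duration:.2f} s)")
```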