I've been using your GLSL shaders with mpv, which work great, and I'm now trying to use your ONNX models with VapourSynth, which is all new to me.
I set up VapourSynth with the vs-mlrt plugins as recommended in your README, and everything seems to work properly. I'm testing with a .vpy file where I'm trying to call vsmlrt.ArtCNN. I see that it expects 16/32-bit float input and the GRAY color family. How would I convert the input source properly?
I tried to look up guides/examples, but the ones I found didn't need that conversion and just used an RGB image as input.
So in my mpv.conf:
# Based on an example; not sure whether buffered-frames and concurrent-frames are needed or what values to use
vf=vapoursynth="~~/test.vpy":buffered-frames=1:concurrent-frames=99
And the test.vpy Python script:
import vapoursynth as vs
from vapoursynth import core
from vsmlrt import ArtCNN, ArtCNNModel, Backend
clip = video_in  # provided by mpv's vapoursynth video filter
# Something here to convert to gray color family?
clip = ArtCNN(clip, model=ArtCNNModel.ArtCNN_C16F64, backend=Backend.TRT(fp16=True))
# Something here to combine gray output with the input's colors?
clip.set_output()
Thanks in advance! Sorry if it's a bit OT, but hopefully it could prove useful for other people trying to use the ONNX models.
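Edit: for what it's worth, here is the kind of conversion I had in mind, in case it helps frame the question. This is just my guess, assuming ArtCNN_C16F64 is a 2x luma upscaler and that extracting/merging the Y plane with std.ShufflePlanes is the right approach; I haven't confirmed this is what you'd recommend.

```python
import vapoursynth as vs
from vapoursynth import core
from vsmlrt import ArtCNN, ArtCNNModel, Backend

clip = video_in  # provided by mpv's vapoursynth video filter

# Convert to 32-bit float 4:4:4 YUV so the luma plane matches the
# GRAYS input that ArtCNN expects.
yuv = core.resize.Bicubic(clip, format=vs.YUV444PS)

# Pull out the luma plane as a GRAY clip.
gray = core.std.ShufflePlanes(yuv, planes=0, colorfamily=vs.GRAY)

gray = ArtCNN(gray, model=ArtCNNModel.ArtCNN_C16F64,
              backend=Backend.TRT(fp16=True))

# Assuming the model doubles the luma resolution, resize the chroma
# to match before merging the planes back together.
yuv_up = core.resize.Bicubic(yuv, width=gray.width, height=gray.height)
merged = core.std.ShufflePlanes([gray, yuv_up, yuv_up],
                                planes=[0, 1, 2], colorfamily=vs.YUV)

merged.set_output()
```

No idea whether a final resize back to the source's bit depth/subsampling is also needed for mpv, or whether it handles YUV444PS output fine.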