-
I am creating an image for Jetson Orin Nano using Yocto+Openembedded-core build tools. I am following this link ( https://github.com/VC-MIPI-modules/vc_mipi_nvidia#integrate-the-driver-in-your-own-bsp…
-
The default for Jpegli is to output YUV as 4:4:4 regardless of quality level. However, according to exiftool, XYB output is 4:2:0 (blue-channel subsampled) by default.
This can be conf…
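Since exiftool is simply reading the component sampling factors out of the JPEG's SOF header, the reported subsampling is easy to verify directly. A minimal Python sketch, assuming a baseline/extended/progressive JPEG with three components (APPn handling is simplified to plain marker skipping):

```python
import struct

def sof_subsampling(jpeg_bytes: bytes) -> str:
    """Walk JPEG markers and report J:a:b chroma subsampling from the SOF segment."""
    i = 2  # skip SOI (0xFFD8)
    while i + 4 <= len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            raise ValueError("marker expected")
        marker = jpeg_bytes[i + 1]
        length = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])[0]
        if marker in (0xC0, 0xC1, 0xC2):  # baseline / extended / progressive SOF
            ncomp = jpeg_bytes[i + 9]
            factors = []
            for c in range(ncomp):
                hv = jpeg_bytes[i + 10 + 3 * c + 1]  # high nibble = H, low nibble = V
                factors.append((hv >> 4, hv & 0x0F))
            hmax = max(h for h, _ in factors)
            vmax = max(v for _, v in factors)
            h, v = factors[1]                # first chroma component, luma-relative
            a = 4 * h // hmax
            b = 0 if v < vmax else a        # vertical subsampling zeroes the third digit
            return f"4:{a}:{b}"
        i += 2 + length                      # skip any other marker segment
    raise ValueError("no SOF marker found")
```

For a 4:2:0 file this returns `"4:2:0"` (luma 2x2, chroma 1x1); for jpegli's 4:4:4 default all components carry 1x1 factors.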
-
I've just settled on transcoding my entire picture library to WebP to net significant filesize savings while minimizing perceptual loss. I've been trying to hack together an ash script to ha…
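For the batch-transcode part, the usual shape is a walk over the library that shells out to cwebp (only the documented `-q` and `-o` flags are used below; `-m`, `-mt`, or `-lossless` for PNGs are left to taste). A sketch in Python rather than ash, with a dry-run mode so the command list can be inspected before anything is overwritten; the directory layout and quality value are assumptions:

```python
import subprocess
from pathlib import Path

def transcode_dir(root: str, quality: int = 80, dry_run: bool = True):
    """Build (and optionally run) a cwebp command for every JPEG/PNG under root.

    Returns the list of commands so a dry run can be reviewed first.
    """
    cmds = []
    for src in sorted(Path(root).rglob("*")):
        if src.suffix.lower() not in {".jpg", ".jpeg", ".png"}:
            continue
        dst = src.with_suffix(".webp")
        if dst.exists():  # skip files already transcoded on a previous run
            continue
        cmd = ["cwebp", "-q", str(quality), str(src), "-o", str(dst)]
        cmds.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)
    return cmds
```

Running with `dry_run=True` first and piping the commands to a log makes it easy to spot-check a few conversions before committing the whole library.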
-
Hey all!
I have a camera library that can do preview, photo capture, video capture, and frame processing at the same time. On iOS this works perfectly, but on Android it actually seems to be impo…
-
I'm developing a software library and am using **libvmaf** successfully for VMAF metric calculation.
Unlike the **vmaf** tool application, in my case I request the calculated VMAF metric value for frame N after submi…
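The behaviour described, where a score for frame N only becomes available once some buffer of later frames has been submitted, is a generic lookahead-latency pattern. The sketch below is illustrative only and does not use libvmaf's actual API; `score_fn` and the `delay` value are placeholders for whatever the metric backend provides:

```python
from collections import deque

class DelayedMetric:
    """Emit a per-frame score `delay` submissions after the frame arrives,
    mimicking a metric pipeline with lookahead latency. Illustrative only."""

    def __init__(self, score_fn, delay: int):
        self.score_fn = score_fn      # placeholder for the real metric call
        self.delay = delay
        self.pending = deque()
        self.next_index = 0

    def submit(self, ref_frame, dist_frame):
        """Queue a frame pair; return (index, score) once enough frames are buffered."""
        self.pending.append((ref_frame, dist_frame))
        if len(self.pending) > self.delay:
            ref, dist = self.pending.popleft()
            idx = self.next_index
            self.next_index += 1
            return idx, self.score_fn(ref, dist)
        return None

    def flush(self):
        """Drain scores for the remaining buffered frames at end of stream."""
        while self.pending:
            ref, dist = self.pending.popleft()
            idx = self.next_index
            self.next_index += 1
            yield idx, self.score_fn(ref, dist)
```

The caller submits frames as they are produced, treats a `None` return as "score not yet available", and calls `flush()` at end of stream.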
-
I opened this issue to handle the client-server connection part.
The idea is to have a server (the FPGA) with a socket on which it stays listening for connection requests…
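The listening-socket flow described is the standard bind/listen/accept sequence. A minimal Python sketch of both ends follows; the real server would run on the FPGA side (presumably in C on a soft core or embedded Linux), and the host address and echo protocol here are assumptions for illustration:

```python
import socket

def start_server(host: str = "127.0.0.1"):
    """Create the listening socket; port 0 lets the OS pick a free port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))
    srv.listen(1)  # stay listening for incoming connection requests
    return srv, srv.getsockname()[1]

def serve_once(srv: socket.socket) -> None:
    """Accept one client, echo its request back, then shut down."""
    with srv:
        conn, _addr = srv.accept()
        with conn:
            conn.sendall(conn.recv(1024))

def request(port: int, payload: bytes, host: str = "127.0.0.1") -> bytes:
    """Client side: connect to the server, send a request, read the reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((host, port))
        cli.sendall(payload)
        return cli.recv(1024)
```

A long-lived server would wrap `accept()` in a loop (or use one thread per connection); the one-shot version keeps the handshake easy to see.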
-
I am trying to convert PNG images into JP2 with a sampling factor. I am using the ImageMagick tool for the conversion; as mentioned in the docs, -sampling-factor is used for sampling, but it did not work.
http://ft…
-
I'm seeing worse quality from vc2hqencode than from the vc-2 reference encoder, even at `--speed=slowest`:
```
./vc2-reference/tools/convert_to_16p2 crowd_run_frame_1.yuv
./vc2-reference/src/En…
-
Currently it's a 2-step process to scale and convert.
Implement a YUV to ARGB scaling function.
The primary use case is rendering related, so it would focus on
1. upsampling. but still work on do…
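As a reference point for what a fused path would have to compute, here is a pure-Python sketch that does nearest-neighbour upsampling and BT.601-style I420-to-ARGB conversion in a single pass. It is a full-swing approximation for illustration only; studio-swing handling and better filtering are omitted, and a real implementation would use SIMD rather than per-pixel loops:

```python
def i420_to_argb_scaled(y, u, v, sw, sh, dw, dh):
    """Scale an I420 frame (flat Y, U, V planes, source sw x sh) to dw x dh
    and convert to ARGB in one pass. Returns a flat list of (a, r, g, b)."""
    def clamp(x):
        return 0 if x < 0 else 255 if x > 255 else x

    cw = sw // 2  # chroma plane width (4:2:0)
    out = []
    for dy in range(dh):
        sy = dy * sh // dh            # nearest-neighbour source row
        for dx in range(dw):
            sx = dx * sw // dw        # nearest-neighbour source column
            Y = y[sy * sw + sx]
            U = u[(sy // 2) * cw + sx // 2] - 128
            V = v[(sy // 2) * cw + sx // 2] - 128
            # BT.601 fixed-point approximation (16-bit fractional coefficients)
            r = Y + ((91881 * V) >> 16)                  # 1.402 * V
            g = Y - ((22554 * U + 46802 * V) >> 16)      # 0.344*U + 0.714*V
            b = Y + ((116130 * U) >> 16)                 # 1.772 * U
            out.append((255, clamp(r), clamp(g), clamp(b)))
    return out
```

A fused function avoids materialising the intermediate scaled YUV planes, which is the point of collapsing the current two-step scale-then-convert flow.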