negge / jpeg_gpu

GPU accelerated JPEG decoder
Apache License 2.0
129 stars 14 forks

Encoder? #4

Open Zubnix opened 6 years ago

Zubnix commented 6 years ago

This is some pretty cool work. Being able to decode JPEG without resorting to vendor-specific GPU libraries is really awesome.

Do you have plans for, or happen to know of, an OpenGL-accelerated JPEG encoder library as well?

negge commented 6 years ago

Thanks for the interest.

This library does not currently support JPEG encoding, but it is something I have thought about. The big problem is that the entropy coding step is inherently serial. However, there are things you could do to parallelize the DCT stage.
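To illustrate the point above (this is a NumPy sketch for intuition, not code from jpeg_gpu): each 8x8 block's DCT depends only on that block's own pixels, so blocks can be transformed in any order, or on thousands of GPU threads at once. The entropy-coded output, by contrast, is one sequential stream of variable-length codes, so where a block's bits land depends on every block encoded before it.

```python
import numpy as np

def dct2_8x8(block):
    """Naive orthonormal 2-D DCT-II of one 8x8 block (the per-block JPEG transform)."""
    N = 8
    n = np.arange(N)
    # DCT-II basis matrix: C[k, n] = cos(pi * (2n + 1) * k / (2N)), scaled orthonormal
    C = np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C *= np.sqrt(2.0 / N)
    C[0, :] = np.sqrt(1.0 / N)
    return C @ block @ C.T

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(16, 16)).astype(np.float64) - 128.0

blocks = [(r, c) for r in (0, 8) for c in (0, 8)]

# Transform blocks in raster order...
out_a = np.empty_like(frame)
for r, c in blocks:
    out_a[r:r+8, c:c+8] = dct2_8x8(frame[r:r+8, c:c+8])

# ...and in reverse order: the result is identical, because no block
# reads another block's data. This independence is what a GPU exploits.
out_b = np.empty_like(frame)
for r, c in reversed(blocks):
    out_b[r:r+8, c:c+8] = dct2_8x8(frame[r:r+8, c:c+8])

assert np.allclose(out_a, out_b)
```

No such order-independence exists for the Huffman bitstream, which is why that stage stays on the CPU (or needs a fundamentally different parallelization strategy, such as per-restart-interval encoding).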

GPU based JPEG encoding would be particularly useful if you were already starting from source that was on the card, e.g., video game capture or screen sharing. Do you have a use case you are interested in?

Zubnix commented 6 years ago

I'm exploring GPU-accelerated live streaming of applications from a server to the browser ( https://github.com/udevbe/greenfield ). The current solution encodes and decodes H.264 on the CPU, while using the GPU for color conversion. This works fine if the server and client are fast, but on a reasonably old (~4-5 year) server CPU the encoding step can take up to ~100 ms for a single 1920x1080 frame, which is unacceptably slow. An OpenGL-accelerated H.264 encoder/decoder would be ideal, but those seem to be even rarer than their JPEG counterparts.
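For reference, the color-conversion step mentioned above computes a per-pixel matrix multiply, which is why it maps so naturally onto a fragment shader. A NumPy sketch of what that step does (assuming the full-range BT.601 coefficients that JFIF uses; a video pipeline might use different ones):

```python
import numpy as np

# JFIF full-range BT.601 RGB -> YCbCr conversion matrix (an assumption;
# not taken from greenfield's actual shader).
M = np.array([
    [ 0.299,     0.587,     0.114   ],   # Y
    [-0.168736, -0.331264,  0.5     ],   # Cb
    [ 0.5,      -0.418688, -0.081312],   # Cr
])

def rgb_to_ycbcr(rgb):
    """rgb: (..., 3) float array in [0, 255]; returns YCbCr in [0, 255]."""
    ycc = rgb @ M.T
    ycc[..., 1:] += 128.0          # center the chroma channels
    return np.clip(ycc, 0.0, 255.0)

# Pure white maps to Y = 255 with neutral chroma (Cb = Cr = 128).
white = rgb_to_ycbcr(np.array([255.0, 255.0, 255.0]))
assert np.allclose(white, [255.0, 128.0, 128.0])
```

On the GPU, the same `M` is a `mat3` uniform and the conversion is one `mat3 * vec3` per fragment, so this stage is essentially free compared to entropy coding.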

The reason it has to be OpenGL is that it has to run on multiple VMs on the same host machine, which these days is possible using Virtio GPU/Virgil 3D. Solutions like CUDA or VA-API are not usable in this situation. For the decoding step, the constraint is WebGL, as other accelerated options are simply not available in the browser. The decoding part of this library should work fine in the browser via WebAssembly.

As for the source image: in my case, applications (source images) can live either on the GPU or in main memory. In most cases, however, the image will be on the GPU (as an EGL image/texture).

In a nutshell, that is why I am really hoping for an OpenGL-accelerated encoder.