chafey closed this issue 4 years ago
Hi Chris,
Your question arrived at the right time.
I am happy to say that it compiles to javascript/wasm. I have written a small wrapper for the interface between javascript and C++. I have also created a simple demo webpage, which can be accessed here.
I pushed the new code changes to the development branch. I will push it to master before next week.
To access it, you need to
git pull https://github.com/aous72/OpenJPH development
To compile it, you need the emscripten tools. After using
emsdk_env.sh
from the emscripten tools, do
cd subprojects/js/build
emmake cmake ..
make
This will create libopenjph.js and libopenjph.wasm in the subprojects/js/html folder. Modify the CMakeLists.txt file as desired.
Cheers, Aous.
Yes very good timing, thanks for the update! I am curious how this will work for medical images which are typically 16 bit grayscale - is that supported by this library yet? If not, is it planned? Is partial bitstream decoding supported yet (or planned)? It would be nice for example to render a 256x256 lower resolution of a 512x512 image with downloading just the part of the bitstream needed for the 256x256 lower resolution (I know this is possible in J2K and assume it is with HTJ2K too). Thanks
The library supports 16-bit unsigned grayscale images; in fact the library itself supports more than 16 bits, but currently ojph_compress and ojph_expand are limited to 16 bits. After your question, I tested with a 16-bit image and discovered a bug in the quantizer. This has been fixed in the development branch, which should be merged into master before next week. Displaying 16-bit data in a web browser is not doable at the moment; no browser I am aware of can display HDR images.
Resolution scalability, whereby you can extract a 256x256 image out of a 512x512 compressed image, is high on the list. However, supporting this on a webpage requires support from the web server; downloading a complete image and then displaying only a low-resolution version of it is a waste of bandwidth. What we need is a server that can truncate the image file to the desired resolution; we do not want a full JPIP server here, only a simplified version of it. The truncated image would then be decoded and displayed. The updated white paper, which is not available yet, discusses this in Section 2.6.
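As an aside for readers, the 256-from-512 extraction follows the usual JPEG 2000 convention that each discarded resolution level halves both dimensions, rounding up. A small sketch of that arithmetic (plain JavaScript, independent of the OpenJPH API):

```javascript
// Dimensions of a JPEG 2000 / HTJ2K image after discarding `level`
// resolution levels: each level halves width and height, rounding up.
function reducedDims(width, height, level) {
  const scale = 1 << level; // 2^level
  return {
    width: Math.ceil(width / scale),
    height: Math.ceil(height / scale),
  };
}

// A 512x512 image reduced by one level yields the 256x256 version
// discussed above.
console.log(reducedDims(512, 512, 1)); // { width: 256, height: 256 }
```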
Cheers, Aous.
Does it support 16 bit signed or just unsigned? While browsers cannot display 16 bit directly, medical imaging applications usually apply a transform (DICOM VOI LUT transform) to map a range of the 16 bit data into 8 bits for display. If the js version could produce a 16 bit image, I can integrate it with cornerstone (https://github.com/cornerstonejs/cornerstone)
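For readers unfamiliar with it, the common linear form of the VOI LUT transform Chris mentions is a windowing function (DICOM PS3.3, Section C.11.2.1.2). A sketch of that mapping; the function and parameter names here are illustrative, not an actual cornerstone or OpenJPH API:

```javascript
// Linear VOI LUT ("windowing"): map signed or unsigned 16-bit samples
// into 0..255 for display, following the linear function in DICOM
// PS3.3 C.11.2.1.2. Names are illustrative, not a real library API.
function applyWindow(pixels, windowCenter, windowWidth) {
  const c = windowCenter - 0.5;
  const w = windowWidth - 1;
  const out = new Uint8ClampedArray(pixels.length); // clamps to 0..255
  for (let i = 0; i < pixels.length; i++) {
    out[i] = Math.round(((pixels[i] - c) / w + 0.5) * 255);
  }
  return out;
}

// e.g. a window of center 128, width 256 over signed data:
// applyWindow(new Int16Array([-100, 0, 255, 1000]), 128, 256)
```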
In terms of resolution scalability - JPIP is one protocol but there are others (including a custom one we are using). What I am looking for is sdk support for partial decode so I can integrate this with my protocol. In this case, I would link the C++ version of your library into the server side and use the js on the client side for decoding in web browser
The library supports both signed and unsigned 16-bit images (and, internally, more than 16 bits).
ojph_compress and ojph_expand accept only yuv and ppm images. You may need to modify these files in order to feed in or extract images that cannot be represented in these formats; you may also need to change ojph_img_io.cpp, which handles the reading and writing.
Give the yuv format a test to see if it meets your needs; up to 16 bits should be supported (let me know how it goes). I understand there is a need to feed in raw files, but that task is low on my list.
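If it helps, a single-component 16-bit "yuv" file in this raw sense is just the samples laid out consecutively, two bytes each. A Node.js sketch of producing such a buffer; the little-endian byte order is an assumption that should be verified against ojph_img_io.cpp:

```javascript
// Pack a grayscale frame into a raw 16-bit buffer, two bytes per
// sample. Little-endian byte order is an ASSUMPTION here -- confirm
// against ojph_img_io.cpp before feeding the result to ojph_compress.
function grayToRawBuffer(samples) { // Int16Array or Uint16Array
  const buf = Buffer.alloc(samples.length * 2);
  for (let i = 0; i < samples.length; i++) {
    // `& 0xffff` reinterprets signed values as their 16-bit pattern
    buf.writeUInt16LE(samples[i] & 0xffff, i * 2);
  }
  return buf;
}
```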
The js script was a first attempt. I will modify it to expose higher bit depths and the signedness of the data before merging it with master. I need to modify the associated script in the html file as well. Ultimately, you have access to the code of the toy examples ojph_wrapper.cpp and index.html, which you can modify to suit your needs.
It is useful to know how end users intend to use the code.
For your image, try

../bin/ojph_compress -i gray_s16.yuv -o test.j2c -num_decomps 5 -dims {width,height} -num_comps 1 -signed true -bit_depth 16 -downsamp {1,1} -reversible true

then expand it using

../bin/ojph_expand -i test.j2c -o test_out.yuv

Let me know if that works for you.
Cheers, Aous.
Good news - I forked the repo and made some changes and can now convert DICOM Medical Images to J2C! The changes are the following:
1) Added mem_outfile so I can compress into memory buffer
2) Added support for RAW images as input (class raw_in : public image_in_base)
3) Added function to ojph_wrapper.cpp so JS can encode a raw file to j2c
I dumped some DICOM images to RAW format and used the above to convert to J2C.
I am happy to submit a PR for this - let me know if this is OK
Here is some data on the image above:
Resolution: 512x512
Bit Depth: 12
Uncompressed Size: 524,228 bytes
Compressed Size: 204,253 bytes
Encoding Time: 7 ms (C++ version running in docker container)
Decoding Time: 14 ms (js version on Chrome 78)
Hardware: 2019 13" MBP 2.8 GHz (Quad Core i7)
The JPEG2000 version of this same image compresses to 189,256 bytes and takes 71ms to decompress with a JS build of OpenJPEG 2.1.1 (NOTE that this is an old version - the latest version is supposed to be much faster)
The JPEG-LS version of this same image compresses to 194,566 bytes and takes 13 ms to decode with a JS build of CharLS.
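Doing the arithmetic on the sizes quoted above (524,228 bytes uncompressed), the three codecs land within about 8% of each other:

```javascript
// Compression ratios from the numbers quoted in this thread
// (512x512, 12-bit image; sizes in bytes).
const uncompressed = 524228;
const compressed = { htj2k: 204253, jpeg2000: 189256, jpegls: 194566 };
for (const [codec, bytes] of Object.entries(compressed)) {
  console.log(`${codec}: ${(uncompressed / bytes).toFixed(2)}:1`);
}
// htj2k: 2.57:1, jpeg2000: 2.77:1, jpegls: 2.69:1
```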
This is exciting. These results are nice -- nicer than the ones I got.
Please submit a pull request and I will make the most of it.
Thank you Chris.
See https://github.com/aous72/OpenJPH/pull/5 for mem_outfile, raw_in and also truncation point capturing during encoding functionality
I should have written this earlier. Thank you for adding this pull request. I will add the functionality gradually.
I just read the HTJ2K white paper which references a possible javascript decoder based on this work. Can you provide any more information about this?