diegomura / react-pdf

📄 Create PDF files using React
https://react-pdf.org
MIT License
14.64k stars · 1.16k forks

Optimization: compress images when rendering the PDF #1444

Open matryxxx02 opened 3 years ago

matryxxx02 commented 3 years ago

I have an issue with rendering: I want to generate a large PDF with a lot of high-resolution images that are downscaled in CSS.

The problem is that rendering takes too long (for a PDF of 50 pages, with 8 images per page) and the resulting file is too large (58 MB). When I remove the images, the size drops to 288 KB. I use react-pdf with an Express API.

It would be nice to optimize the process by compressing the images.

Previously I used html5-to-pdf and didn't have to worry about image size, because it handled that by itself.

gwalshington commented 2 years ago

I second this.

Images should have an option to be compressed, and there should also be an option to compress and optimize the PDF produced by react-pdf/renderer.

ghost commented 2 years ago

Yeah, with big image files the PDF gets bigger too.

airidasj commented 1 year ago

Waiting for this.

airidasj commented 1 year ago

Also, I don't know why, but the Image component does not support WebP images. We have compressed WebP images ready to use, but the Image component renders blank when we pass them in.

ghost commented 1 year ago

As far as I remember, the Image component currently only supports JPG and PNG. Try converting your WebP to base64 and see. It may be a limitation of the underlying PDFKit. @airidasj

asiraky commented 1 year ago

Or you could do the optimization yourself with very little effort. I had PDFs that were 5-10 MB due to images, and all I did was sign up for a free account with cloudimg.io, then replace all my image tags with src={`https://username.cloudimg.io/${original_source}?w=800`}. This optimizes the image on the fly by limiting its width to 800px (you can use a lower value if you know your image is narrower than 800). My PDFs shrank by roughly a factor of five, and it was a one-liner.
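The substitution described above can be sketched as a small helper. The account name and default width here are placeholders, not values from this thread:

```javascript
// Hypothetical helper: wrap an original image URL in a cloudimg.io
// on-the-fly resize URL. `account` and `width` are placeholder defaults.
function cloudimgUrl(originalSrc, { account = 'username', width = 800 } = {}) {
  return `https://${account}.cloudimg.io/${originalSrc}?w=${width}`
}

// Usage in a react-pdf Image tag:
// <Image src={cloudimgUrl(original_source)} />
```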

airidasj commented 1 year ago

I ended up generating the large PDF as-is, then using Ghostscript to optimize it.

I generate a 20-30 MB PDF, which is then compressed down to around 150 KB with nearly no reduction in quality.

Ghostscript can only be invoked as a terminal command (outside of Node).

So I generate the PDF files in the backend and save them to disk with fs. The files saved to disk are then compressed with this command:

  const { exec } = require('child_process')
  const fs = require('fs')

  function compressPdf() {
    const inputFile = 'temp-pdf/' + Date.now() + '-input.pdf'
    const outputFilePath = 'temp-pdf/' + Date.now() + '-output.pdf'

    return new Promise((resolve, reject) => {
      exec(
        `gs -q -dNOPAUSE -dBATCH -dSAFER -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/ebook -dEmbedAllFonts=true -dSubsetFonts=true -dAutoRotatePages=/None -dColorImageDownsampleType=/Bicubic -dColorImageResolution=100 -dGrayImageDownsampleType=/Bicubic -dGrayImageResolution=100 -dMonoImageDownsampleType=/Subsample -dMonoImageResolution=100 -sOutputFile=${outputFilePath} ${inputFile}`,
        (err, stdout, stderr) => {
          if (err) {
            console.log('Compression (commercial offer PDF) error')
            console.log('err :>> ', err)
            return reject(err)
          }

          // createReadStream is synchronous; no await needed
          const newStream = fs.createReadStream(outputFilePath)

          // delete the temporary files (on POSIX systems the open read
          // stream keeps the output readable until it is consumed)
          fs.unlinkSync(inputFile)
          fs.unlinkSync(outputFilePath)

          // resized PDF
          resolve(newStream)
        },
      )
    })
  }
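For context on the `-dPDFSETTINGS=/ebook` flag used above: this is a standard Ghostscript preset, and the other presets (standard Ghostscript behavior, not specific to this thread) trade file size against image resolution:

```shell
# Ghostscript -dPDFSETTINGS presets, from smallest output to highest quality:
#   /screen   - low resolution (~72 dpi images), smallest files
#   /ebook    - medium resolution (~150 dpi), the preset used above
#   /printer  - print quality (~300 dpi)
#   /prepress - high quality, color-preserving (~300 dpi)
gs -q -dNOPAUSE -dBATCH -dSAFER -sDEVICE=pdfwrite \
   -dPDFSETTINGS=/screen -sOutputFile=out.pdf in.pdf
```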

Also need to install ghostscript:

sudo apt update
sudo apt install ghostscript

On macOS: brew install ghostscript

marfb commented 1 year ago

Bumping this. @airidasj's solution does not fit my project's needs.

krzysztof-stanislawski commented 9 months ago

Bump, also waiting for at least a Node.js solution.