I am working with Azure Blob Storage, usually with large scanned PDF documents.
I can reduce the size of the documents with GhostscriptProcessor, but first I have to save each document to a physical folder, because with Blob Storage all I have is a Stream.
Can I reduce the size of the documents directly from the stream, without the temp file?
Please help.
// "buffer" is the file stream coming from the API POST
var ms = new MemoryStream();
buffer.CopyTo(ms);
// save the file to a temp physical folder
File.WriteAllBytes(inputFile, ms.ToArray());
ms.Close();
GhostscriptVersionInfo gvi = new GhostscriptVersionInfo(libPath);
GhostscriptProcessor proc = new GhostscriptProcessor(gvi);
List<string> switches = new List<string>();
switches.Add("-empty"); // placeholder: Ghostscript skips the first argument (treated like argv[0])
switches.Add("-sDEVICE=pdfwrite");
switches.Add("-dCompatibilityLevel=1.4");
switches.Add("-dPDFSETTINGS=/ebook");
switches.Add("-dNOPAUSE");
switches.Add("-dQUIET");
switches.Add("-dBATCH");
switches.Add(@"-sOutputFile=" + outputFile);
switches.Add(@"-f");
switches.Add(inputFile);
// reduce the PDF size and write the output to the temp physical folder
proc.Process(switches.ToArray());
// read the output file back from the temp physical folder
var newStream = File.OpenRead(outputFile);
...
then clear the temp physical folder and save the output file to Blob Storage...
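For reference, here is a minimal sketch of the temp-file round trip above with guaranteed cleanup. The helper name `ReducePdf` and the `libPath` parameter are assumptions (the switches are the ones from the question); Ghostscript's pdfwrite device works on file paths, so the temp files are removed in a `finally` block and only the reduced bytes are handed back for the blob upload:

```csharp
using System.IO;
using Ghostscript.NET;
using Ghostscript.NET.Processor;

// Hypothetical helper: compress a PDF stream via temp files and
// return the reduced bytes for upload to blob storage.
static byte[] ReducePdf(Stream input, string libPath)
{
    string inputFile = Path.GetTempFileName();   // temp input path (file is created)
    string outputFile = Path.GetTempFileName();  // temp output path
    try
    {
        // stream straight to disk; no intermediate MemoryStream/ToArray() copy
        using (var fs = File.Create(inputFile))
            input.CopyTo(fs);

        var gvi = new GhostscriptVersionInfo(libPath);
        using (var proc = new GhostscriptProcessor(gvi))
        {
            string[] switches =
            {
                "-empty",                        // first argument is ignored by Ghostscript
                "-sDEVICE=pdfwrite",
                "-dCompatibilityLevel=1.4",
                "-dPDFSETTINGS=/ebook",
                "-dNOPAUSE", "-dQUIET", "-dBATCH",
                "-sOutputFile=" + outputFile,
                "-f", inputFile
            };
            proc.Process(switches);
        }
        return File.ReadAllBytes(outputFile);    // caller uploads these bytes
    }
    finally
    {
        File.Delete(inputFile);                  // always clear the temp files,
        File.Delete(outputFile);                 // even if Ghostscript throws
    }
}
```

The byte array can then be wrapped in a `MemoryStream` for the Blob Storage upload, so no temp files outlive the call.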