Open breakdom opened 4 years ago
Hmm, I'm not sure, but how do you know it's memory?
For debugging purposes, could you resize your source SVG before exporting, and then use a smaller scale when exporting? For example, 2x before exporting and then 5x while exporting.
> Hmm, I'm not sure, but how do you know it's memory?
I'm not sure it's memory; at the moment it's a hypothesis.
> For debugging purposes, could you resize your source SVG before exporting, and then use a smaller scale when exporting? For example, 2x before exporting and then 5x while exporting.
OK, I can try exporting the SVG from Illustrator at 2x and then converting to 5x with the svgexport command, but how can I scale an SVG programmatically to 2x? Thank you.
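One way to pre-scale an SVG programmatically is to multiply the `width`/`height` attributes on the root element while preserving the drawing's coordinate system via `viewBox`. This is a minimal sketch, not an Illustrator feature: it assumes the root `<svg>` has plain numeric `width`/`height` (no units), which may not hold for every exported file.

```python
import xml.etree.ElementTree as ET

# Keep the default SVG namespace so the output has no ns0: prefixes.
ET.register_namespace("", "http://www.w3.org/2000/svg")

def scale_svg(svg_text, factor):
    """Enlarge an SVG's canvas by `factor`.

    Assumes the root <svg> carries numeric, unitless width/height.
    If no viewBox is present, one is added so the content scales
    with the canvas instead of being clipped.
    """
    root = ET.fromstring(svg_text)
    w = float(root.get("width"))
    h = float(root.get("height"))
    if root.get("viewBox") is None:
        root.set("viewBox", f"0 0 {w} {h}")
    root.set("width", str(w * factor))
    root.set("height", str(h * factor))
    return ET.tostring(root, encoding="unicode")

if __name__ == "__main__":
    src = ('<svg xmlns="http://www.w3.org/2000/svg" '
           'width="100" height="50"><rect width="100" height="50"/></svg>')
    print(scale_svg(src, 2))
```

You would run this over the source file before handing it to svgexport, e.g. reading the SVG with `open(...).read()` and writing the returned string back out.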
A test with a 7MB file succeeded scaling up to 10x, while starting from the original 27MB file the maximum scale was 6x. This makes me think of a limit on available memory, although on a server with 8GB of RAM it worked while on this one, which has 64GB, it doesn't.
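If the bottleneck really is memory on the Node side (an assumption, not a confirmed diagnosis, and note that the actual rendering happens in Chromium, which this flag does not affect), one thing to try is raising the V8 heap ceiling for the node process via `NODE_OPTIONS`, since svgexport itself does not appear to expose a memory flag:

```shell
# Give the node process that runs svgexport a larger V8 old-space heap.
# 4096 MB is an arbitrary example value, not a tuned recommendation.
export NODE_OPTIONS="--max-old-space-size=4096"

# then run the export as usual, e.g.:
#   svgexport 01.svg 01.jpg 10x
```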
I found a difference between the servers: on the first, if I launch the command `svgexport 01.svg 01.jpg 10x` it converts the file without a problem. On the second server, the same command gives me this:
```
(node:2754) UnhandledPromiseRejectionWarning: Error: Failed to launch the browser process!
[0301/185023.253613:ERROR:zygote_host_impl_linux.cc(89)] Running as root without --no-sandbox is not supported. See https://crbug.com/638180.

TROUBLESHOOTING: https://github.com/puppeteer/puppeteer/blob/master/docs/troubleshooting.md

    at onClose (/usr/local/lib/node_modules/svgexport/node_modules/puppeteer/lib/Launcher.js:750:14)
    at Interface.<anonymous> (/usr/local/lib/node_modules/svgexport/node_modules/puppeteer/lib/Launcher.js:739:50)
    at Interface.emit (events.js:333:22)
    at Interface.close (readline.js:409:8)
    at Socket.onend (readline.js:187:10)
    at Socket.emit (events.js:333:22)
    at endReadableNT (_stream_readable.js:1204:12)
    at processTicksAndRejections (internal/process/task_queues.js:84:21)
(node:2754) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). To terminate the node process on unhandled promise rejection, use the CLI flag --unhandled-rejections=strict (see https://nodejs.org/api/cli.html#cli_unhandled_rejections_mode). (rejection id: 1)
(node:2754) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
```
This problem only occurs on the current server; on two other servers it doesn't happen. How can I do a clean install of the program, deleting everything I have already installed?
@breakdom Puppeteer/Chromium does not like to be run as root; if you switch to a different user it might start working. Installing the dependencies as @shakiba pointed out might also help. As this is enforced by Puppeteer/Chromium for security reasons, there is little we can do on our side (and I would rather keep the sandbox enabled; disabling the sandbox did not work in my setup anyway, so I recommend not running svgexport as root). As for the original problem, "Scale error over 6x", this might also be a timeout problem; we will add a way to configure the timeout in the next version (which will hopefully be released soon).
I found this difference between the servers where it works and the one where it doesn't: on the servers where it works, the output file is written to the file system during the conversion and you can watch it grow. On the server where it doesn't work, the conversion happens in a memory buffer that is written to the file system only after the conversion finishes, and I think this buffer can't hold a large conversion.
Hi, I tested this library on a small server where I was converting SVGs of about 28MB at 10x scale without problems. Now I have switched to a new server, and if I exceed 6x the conversion stops. Does anyone know which parameter must be changed to give more memory to the process?