dbpolito opened this issue 7 years ago
I'm on mobile now so can't give this my full attention, but to debug I'd try it with smaller files to see if you can get any success. It's been quite a while since I've used this, so forgive me if the edges are a little rough.

On Mon, Jul 10, 2017 at 5:30 PM Daniel Polito notifications@github.com wrote:
Hey, great stuff you have here... congrats!

Sorry to bother you. I'm trying to do basically the same thing you do here, but I'm running into a problem: the size limit and the job timeout.

It seems you have a 500 MB limit and a 5 minute timeout. Did you have a problem with these limits, or does your use case only handle small files?
Thanks for your reply. Yes, it does work great with small files, but I will need to work with big files (1 GB+) and 5+ minute jobs... I was wondering if there is any way to bypass these limits; I only noticed them now 😢
It's been so long since I've been in here that my preliminary digging left me laughing at whatever this commit was doing, so this perhaps isn't the most robust path... but that's not very helpful, so here are some thoughts.
I remember the max time for Lambda functions when I was working on this was 60s; I'm not sure if that's been extended to 5 minutes or if it's configurable, but I assume that's where you're seeing an issue. If you're sure the timeout is what's killing your function, I'd look there to see if you can opt into a longer one. Alternatively, I'd consider whether it should really be taking a full 5 minutes to run the process on your local machine – that seems like a long time, though it's certainly feasible for processing video.
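If it does turn out to be configurable, something like this would show the current setting and request a longer one. This is just a sketch in Python/boto3 – the function name `gif-to-mp4` is a placeholder, not something from this repo:

```python
import boto3

lambda_client = boto3.client("lambda")

# See what the deployed function is currently allowed.
config = lambda_client.get_function_configuration(FunctionName="gif-to-mp4")
print("timeout (s):", config["Timeout"], "| memory (MB):", config["MemorySize"])

# Ask for a longer timeout (300s here, i.e. the 5-minute figure you mentioned).
lambda_client.update_function_configuration(FunctionName="gif-to-mp4", Timeout=300)
```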
If all that passes sanity checking and you still want to use Lambda, you may need to break the process across multiple Lambda function calls and use some other mechanism for orchestrating a string of longer-running functions. I did this once, and while it was hacky, it did work. The code for that is here, but it's perhaps better explained in this readme or this blog post.
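The rough shape of that idea: each invocation does a bounded chunk of work, then asynchronously kicks off the next step with whatever state it needs. A minimal sketch (Python/boto3; the event fields and `process_chunk` helper are made up for illustration, not from this repo):

```python
# Sketch: split a long job across multiple Lambda invocations.
# Each invocation processes one chunk, then hands off to the next one.
import json
import boto3

lambda_client = boto3.client("lambda")

def handler(event, context):
    chunk = event.get("chunk", 0)
    total_chunks = event["total_chunks"]

    process_chunk(event["job_id"], chunk)  # the real ffmpeg/segment work goes here

    if chunk + 1 < total_chunks:
        # Fire-and-forget the next step so this invocation exits
        # well inside its own timeout.
        lambda_client.invoke(
            FunctionName=context.function_name,  # re-invoke this same function
            InvocationType="Event",              # async
            Payload=json.dumps({**event, "chunk": chunk + 1}),
        )
    return {"completed_chunk": chunk}

def process_chunk(job_id, chunk):
    # Placeholder for the actual per-chunk processing.
    pass
```

Each step would still need to stash its intermediate output somewhere durable (S3, most likely) so the next invocation can pick it up.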
Either way, I'd also want to know whether you can even download your files fast enough to start processing them – if AWS doesn't let the file in at all, that seems like a non-starter. Maybe you can compress them? You should be able to write a quick function with logging to confirm that the files are at least being downloaded to the AWS machine.
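A throwaway version of that check could look like this (Python/boto3; the bucket and key are hypothetical). Note that `/tmp` is the only writable path on a Lambda box and is itself size-limited (historically around 512 MB), which may well be the ~500 MB ceiling you're seeing:

```python
# Sketch: confirm a large S3 object can be pulled down inside a Lambda invocation,
# and log how big it is and how long the download took.
import time
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    bucket = event["bucket"]   # e.g. "my-gif-bucket" (hypothetical)
    key = event["key"]         # e.g. "uploads/big.gif" (hypothetical)

    size = s3.head_object(Bucket=bucket, Key=key)["ContentLength"]
    print(f"source object is {size / 1024 / 1024:.1f} MB")

    start = time.time()
    s3.download_file(bucket, key, "/tmp/source.gif")
    print(f"downloaded in {time.time() - start:.1f}s, "
          f"{context.get_remaining_time_in_millis() / 1000:.0f}s left before timeout")
    return {"bytes": size}
```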
And after all that, something like apex might be getting more maintenance attention these days, so you should at least have a look.
@dbpolito wouldn't a 5-minute+ Lambda task be crazy expensive if you ran it, let's say, thousands of times? With >1 GB files and 5+ minute run times you might as well use something like node-ffmpeg and just provision an EC2 instance.
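For a rough sense of the numbers, a back-of-envelope based on Lambda's per-GB-second billing – the prices below are assumptions and worth checking against the current AWS pricing page:

```python
# Back-of-envelope Lambda cost estimate. Prices are assumptions; verify against
# the current AWS pricing page before relying on them.
PRICE_PER_GB_SECOND = 0.0000166667   # assumed Lambda compute price
PRICE_PER_REQUEST = 0.0000002        # assumed per-invocation price ($0.20 / 1M)

memory_gb = 1.5      # memory allocated to the function
duration_s = 300     # a full 5-minute run
runs = 1000

compute = memory_gb * duration_s * PRICE_PER_GB_SECOND * runs
requests = PRICE_PER_REQUEST * runs
print(f"~${compute + requests:.2f} for {runs} runs")   # roughly $7.50
```

Whether that counts as "crazy expensive" compared to keeping a small EC2 box running ffmpeg depends mostly on your volume and how bursty it is.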