Open forgotpw1 opened 12 years ago
I'm still experiencing this. My guess is that some limit is getting hit. I was able to avoid it temporarily by not using S3 authentication, but with a big set (700+ inputs) it exploded again. I think all the extra characters in the keys and signatures are giving the `input` object more than it can handle.
This is pretty much a show stopper and makes it really hard to use this in production.
I'm optimistic though that someone out there knows what's going on here.
What is the overall lifecycle of `input`? Is `input` really a JSON object?
Is there a memory limit in Thin or in Ruby for JSON objects? Could another JSON library solve this?
Stuff works great when the input size is small, but unfortunately large sets aren't working, and it's really hard to pin down what triggers it. It's as if the server instance can't handle the size of the input array.
Any insight into this would be helpful.
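To help rule out a hard limit in Ruby's stdlib JSON itself, a quick round-trip of a 700-element payload of long signed-URL-style strings (the sizes are assumptions mirroring the numbers above, not real data) succeeds without trouble, which points the suspicion elsewhere (e.g. the server or the database layer):

```ruby
require 'json'

# Sanity check, not a fix: build 700 long strings shaped like signed
# S3 URLs, encode them to JSON, and parse them back. Ruby's stdlib
# JSON round-trips this without hitting any size limit.
payload = Array.new(700) { |i| "s3://bucket/path-#{i}?signature=#{'x' * 200}" }
encoded = JSON.generate(payload)
decoded = JSON.parse(encoded)
```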
I have hit a problem with a big set of `inputs` (500+). I am building a file compression box: an action to zip a set of inputs. When I try to merge, it appears that upstream, in the process step, the `inputs` have "blown up." The process step just saves each file to S3 and should return the new path to the `inputs` Array. Specifically, in the merge step, `inputs` should be an `Array`, but instead it is coming through as a `String`. The error message looks like this:
Anyone else ever hit this?
Could this be due to a text field in the database filling up with too many characters? I switched it to a `longtext` field, but that didn't do the trick. Is there some other memory issue with filling an array?
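As a stopgap, one thing worth trying in the merge step (a hedged sketch, not a confirmed fix; `normalize_inputs` is a made-up name) is to re-parse the payload if it arrives as a `String`:

```ruby
require 'json'

# Hypothetical guard: if the collected inputs arrive as a JSON string
# instead of an Array (the symptom described above), parse them back
# before iterating. This works around the symptom, not the root cause.
def normalize_inputs(inputs)
  inputs.is_a?(String) ? JSON.parse(inputs) : inputs
end
```

If the `String` isn't valid JSON, `JSON.parse` will raise, which at least surfaces what the payload actually looks like at that point.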
Here's my action. It is erroring on the block with `inputs.each`:
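The actual action code didn't survive here, so as a hedged sketch only: the shape described above (process uploads one file and returns its new path; merge expects an `Array` of those paths) would look roughly like this. The `CloudCrowd::Action` base class is stubbed out, the method signatures are simplified rather than the gem's exact interface, and `upload_to_s3` is a made-up stand-in for the real S3 save:

```ruby
# Stub base class so the sketch is self-contained; the real gem
# provides CloudCrowd::Action with a richer interface.
module CloudCrowd
  class Action; end
end

class ZipInputs < CloudCrowd::Action
  # Runs once per input and returns the new S3 path, which the
  # framework collects into the Array handed to merge.
  def process(input)
    upload_to_s3(input)
  end

  def merge(inputs)
    # The bug reported above: for large sets, `inputs` shows up here
    # as a String rather than the expected Array of S3 paths.
    raise TypeError, "expected Array, got #{inputs.class}" unless inputs.is_a?(Array)
    inputs.each { |path| path } # zip step omitted
  end

  private

  def upload_to_s3(input)
    "s3://bucket/#{input}" # stand-in for the real upload
  end
end
```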