Closed tlvenn closed 6 years ago
@edgurgel any thoughts? Thanks in advance!
cjson strikes again?
127.0.0.1:6379> EVAL 'return cjson.encode(cjson.decode(\'{"args":[{"activity_id":393087288629462191}]}\'))' 0
"{\"args\":[{\"activity_id\":3.9308728862946e+17}]}"
Comparison with Poison:
iex(13)> Poison.decode!(Poison.encode!(%{"activity_id" => 393087288629462191}))
%{"activity_id" => 393087288629462191}
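The difference comes from how Lua (and therefore cjson) represents numbers: every value is an IEEE-754 double, which can only hold integers exactly up to 2^53. A quick sketch of the precision loss (Python here purely for illustration; the project itself is Elixir):

```python
# cjson stores every number as a 64-bit double, so integers above 2**53
# get rounded to the nearest representable double on decode.
big = 393087288629462191

as_double = float(big)        # what cjson effectively keeps
print(int(as_double))         # rounded: no longer the original value
print(big > 2**53)            # True: outside the exact-integer range
print(int(float(2**53)))      # 2**53 itself still round-trips exactly
```

Poison/Jason never go through a double, so the integer survives on the Elixir side.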
-.- @krasio haha cjson is basically the reason for all problems :)
Yeah, it's because this is not an "integer" anymore. It's a "Big Number" that cjson won't support.
For integers that need more than 53 bits you will need to use a string, I think, and then convert.
More here: https://github.com/DaveGamble/cJSON/issues/162
I'm not sure if we can do much apart from documenting that we should avoid integers greater than 2^53 (cjson stores every number as a double, which is only exact up to that point).
Ha, thanks for pointing out cjson. I did not realise that the Lua scripts needed to decode the job to move it around on the Redis side.
The thing is, the Lua side does not care at all about the job args, so it seems we only need to wrap them in a way that cjson treats them as a string; on the Elixir side, when we decode the job, we can unwrap them.
Simply doing a two-pass JSON encoding/decoding would do the trick.
What do you think ?
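The two-pass idea could be sketched like this (Python for illustration only; the real implementation would live in Verk's Elixir encoder/decoder, and the key names are taken from the example above):

```python
import json

# Sketch: serialize args separately, so the outer JSON document only
# carries an opaque string that cjson never parses.
job = {"queue": "default", "args": [{"activity_id": 393087288629462191}]}

wire = dict(job, args=json.dumps(job["args"]))   # inner encode
payload = json.dumps(wire)                        # outer encode

# Lua/cjson can decode `payload`, touch other keys (e.g. enqueued_at),
# and re-encode it without ever turning the big integer into a double.

restored = json.loads(payload)                    # outer decode
restored["args"] = json.loads(restored["args"])   # inner decode (unwrap)
assert restored["args"][0]["activity_id"] == 393087288629462191
```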
Ok so the reason why we need to encode again is for this:
https://github.com/edgurgel/verk/blob/master/priv/enqueue_retriable_job.lua#L7-L12
We add enqueued_at to the JSON.
A really, really nasty solution?
"Manually" add the key "enqueued_at": ... to the original JSON string.
Then we won't suffer from the cjson encoding rules?
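The manual splice would look roughly like this (Python for illustration; the timestamp value is made up). It treats the job as raw text, so cjson never decodes and re-encodes the big integer:

```python
import json

# Splice enqueued_at into the raw JSON text instead of decoding it.
# Assumes the payload is a JSON object, so it ends with '}'.
raw = '{"args":[{"activity_id":393087288629462191}]}'
enqueued_at = 1514764800  # hypothetical Unix timestamp

patched = raw[:-1] + ',"enqueued_at":%d}' % enqueued_at

doc = json.loads(patched)
assert doc["enqueued_at"] == 1514764800
assert doc["args"][0]["activity_id"] == 393087288629462191  # untouched
```

Fragile (it assumes the object has no trailing whitespace and that the key isn't already present), which is presumably why it was called nasty.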
Ya, I saw that, and it's fine. My point is that as far as Redis/Lua is concerned, the args can be opaque; the script never needs them. So we can protect them from cjson by simply encoding them as a JSON string using Poison, meaning that in Redis, args is a JSON-encoded string.
We simply need a custom decoder/encoder with Poison. Might be a good opportunity to switch to Jason as well.
That's the problem. It needs them.
It needs to know which queue the job will be moved to, and it needs to populate enqueued_at. We can't remove the job from the sorted set before we add it to the queue, so we don't have jobs out of Redis at any point in time.
Or do you mean double encoding just args?
Now I understood your point! :)
Yeah I'm completely fine doing this approach + using Jason if it makes sense 👍
Ok great, let me submit a PR then
The only downside that I can think of is that it loses the property of accepting "Sidekiq/Resque" clients sending jobs to Verk. They would need to double-encode args before enqueuing the job 🤔
Hmm, I did not think about that. We should be able to detect this easily enough: the args should never be a string to begin with, so I can unwrap them when I detect one (a job enqueued by Verk) or otherwise bypass this step (a job enqueued by Sidekiq/Resque).
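That detection step could be as simple as a type check on args (Python sketch for illustration; the function name is hypothetical, and the real code would be Elixir pattern matching):

```python
import json

def decode_args(raw_args):
    """Hypothetical unwrap step: Verk double-encodes args as a JSON
    string, while Sidekiq/Resque clients send a plain JSON array."""
    if isinstance(raw_args, str):
        return json.loads(raw_args)   # enqueued by Verk: unwrap
    return raw_args                   # already a list: Sidekiq/Resque

# Both shapes decode to the same structured args.
assert decode_args('[{"id": 1}]') == [{"id": 1}]
assert decode_args([{"id": 1}]) == [{"id": 1}]
```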
I have pushed the commit to take care of that and fixed the two Credo issues.
Hi,
It seems like integers are not properly decoded for some reason. For example, the following:
is failing with:
Two things to notice from the error and stacktrace: the value 3.9308728862946e17, and an argument error emitted by :erlang.binary_to_integer(393087288629462191). I don't understand why and how a cast from String to Integer is happening. Any idea what is going on?