colinmollenhour / Cm_Cache_Backend_Redis

A Zend_Cache backend for Redis with full support for tags (works great with Magento)

Too many values to unpack when using lua #88

Closed darkopetreski closed 7 years ago

darkopetreski commented 9 years ago

When I use lua, in a cron script that processes a huge volume of data, I get the fatal error "Too many values to unpack". If I set use_lua=false it works, but it is much slower.

darkopetreski commented 9 years ago

Here is the exact error:

PHP Fatal error: Uncaught exception 'CredisException' with message 'Error running script (call to f_1617c9fb2bda7d790bb1aaa320c1099d81825e64): user_script:1: too many results to unpack
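For context, the message originates in Lua itself rather than in the backend: Redis embeds Lua 5.1, where `unpack()` refuses to expand a table past the interpreter's C-stack limit (`LUAI_MAXCSTACK`, around 8000 by default). A minimal standalone repro, assuming a plain Lua 5.1 interpreter:

```lua
-- Build a table larger than the C-stack limit and try to unpack it.
local t = {}
for i = 1, 100000 do t[i] = i end
print(unpack(t)) -- raises: "too many results to unpack"
```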

colinmollenhour commented 9 years ago

Try adjusting lua_max_c_stack in your config options to, say, 1000. No idea why the default 5000 would not work for you, though...
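For anyone landing here from Magento: `lua_max_c_stack` is a backend option, so (assuming the usual `app/etc/local.xml` cache block — the server/port values below are placeholders) the setting would look something like:

```xml
<cache>
    <backend>Cm_Cache_Backend_Redis</backend>
    <backend_options>
        <server>127.0.0.1</server>
        <port>6379</port>
        <use_lua>1</use_lua>
        <!-- lower the unpack() chunk size from the default of 5000 -->
        <lua_max_c_stack>1000</lua_max_c_stack>
    </backend_options>
</cache>
```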

darkopetreski commented 9 years ago

Excellent! Sorry that I missed this option in the first place.

Thanks.

darkopetreski commented 9 years ago

lua_max_c_stack does not affect the save function (only _removeByMatchingAnyTags). For big nightly scripts that store a value with a huge number of tags, save hits the same error. I will disable lua for this case, but it would be good if lua_max_c_stack also applied to save.

colinmollenhour commented 9 years ago

Ahh, the issue is on the save function. Details like that are good to have. :)

I personally would fix the application to not use so many bloody tags, but the other option would be to update the lua code in the save function to use the _luaMaxCStack property similarly to the _removeByMatchingAnyTags method.
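A chunked `unpack` along the lines of what `_removeByMatchingAnyTags` already does could be applied to the save path. A rough sketch, not the library's actual code (the 1000 limit and the use of `KEYS[1]` as the target set are illustrative):

```lua
-- Sketch: add members to a set in chunks so that unpack() never
-- expands more elements at once than the configured C-stack limit.
-- ARGV holds the tag members; KEYS[1] is the target set (illustrative).
local maxCStack = 1000  -- would come from the _luaMaxCStack property
for i = 1, #ARGV, maxCStack do
    redis.call('SADD', KEYS[1], unpack(ARGV, i, math.min(i + maxCStack - 1, #ARGV)))
end
```

The three-argument form `unpack(t, i, j)` expands only the slice from `i` to `j`, which is what keeps each call under the stack limit.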