tango-controls-hdbpp / libhdbpp-mysql

Library for HDB++ implementing MySQL schema. Moved to https://gitlab.com/tango-controls/hdbpp/libhdbpp-mysql
http://www.tango-controls.org/community/projects/hdbplus

"Prepared statement contains too many placeholders" on insertion of images bigger than 5461 pixels #5

Closed prokofyevDmitry closed 6 years ago

prokofyevDmitry commented 6 years ago

When using hdb++es-srv to archive 251x251 images (for example the double_image attribute of the TangoTest device server), the MySQL server raises the error "Prepared statement contains too many placeholders" while the insertion query is being prepared.

Expected Behavior

When archiving an image, the method HdbPPMySQL::store_array (LibHdb++MySQL.cpp, line 1624) should insert the array without error, regardless of the image dimensions.

Current Behavior

The method HdbPPMySQL::store_array returns the error "Prepared statement contains too many placeholders" when the number of placeholders exceeds 65,535. In the current code there are 12 placeholders per point, which limits archived images to at most 5,461 pixels. This is very restrictive.

Possible Solution

The simplest solution would be to split the INSERT statement in HdbPPMySQL::store_array into as many chunks of at most 5,461 points (65,532 placeholders) each as needed.
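A minimal sketch of the chunking idea (function and constant names here are illustrative, not the library's actual code): compute the largest number of points that fits under MySQL's 65,535-placeholder limit at 12 placeholders per point, then split the array into ranges of at most that size. For a 251x251 image (63,001 points) this yields 12 chunks, the last holding 2,930 points.

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>
#include <vector>

// MySQL prepared statements allow at most 65,535 '?' placeholders,
// and store_array currently binds 12 placeholders per array point.
static const std::size_t kMaxPlaceholders = 65535;
static const std::size_t kPlaceholdersPerPoint = 12;

// Hypothetical helper: split npoints into (first_index, count) ranges
// so that each chunk's INSERT stays under the placeholder limit.
std::vector<std::pair<std::size_t, std::size_t> > chunk_ranges(std::size_t npoints)
{
    const std::size_t max_points = kMaxPlaceholders / kPlaceholdersPerPoint; // 5461
    std::vector<std::pair<std::size_t, std::size_t> > ranges;
    for (std::size_t first = 0; first < npoints; first += max_points)
    {
        std::size_t count = std::min(max_points, npoints - first);
        ranges.push_back(std::make_pair(first, count));
    }
    return ranges;
}
```

In the real method each range would then drive one prepared INSERT with `count * 12` placeholders.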

Another solution would be to write the complete, already-formatted data to a temporary file and use LOAD DATA INFILE to insert it.
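The LOAD DATA INFILE route sidesteps placeholders entirely. A sketch under assumed names (the table and column names below are illustrative, not necessarily the library's actual schema, and actually executing the statement via the MySQL C API is left out): dump one tab-separated row per array element to a temporary file, then build the LOAD DATA statement that would bulk-load it.

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

// Hypothetical helper: write (index, value) pairs to a TSV file at
// `path` and return the LOAD DATA statement that would insert them.
// No placeholders are involved, so the 65,535 limit does not apply.
std::string dump_for_load_data(const std::vector<double> &values,
                               const std::string &path)
{
    std::ofstream out(path.c_str());
    for (std::size_t i = 0; i < values.size(); ++i)
        out << i << '\t' << values[i] << '\n'; // one array element per row
    out.close();

    std::ostringstream stmt;
    stmt << "LOAD DATA LOCAL INFILE '" << path
         << "' INTO TABLE att_array_devdouble_ro (idx, value_r)";
    return stmt.str(); // the real code would run this with mysql_query()
}
```

Note that LOAD DATA LOCAL INFILE must be enabled on both the server and the client connection for this to work.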

gscalam commented 6 years ago

Thanks for the report. I'll try to chunk INSERT requests. I am also working on a release to store arrays in a JSON field of a single row.

prokofyevDmitry commented 6 years ago

Cool thx very much 👍

prokofyevDmitry commented 6 years ago

Hi, I just wanted to know how long it will take to fix this issue, so I can plan my project, which might rely on this feature. Thanks in advance.

gscalam commented 6 years ago

Hi, I am committing the chunked insert today. The modification to store a single JSON row instead is almost ready, but still lacking documentation.

gscalam commented 6 years ago

Hi, I also committed the JSON implementation. I only tested it with small arrays, but according to the MySQL documentation the only size limit should be related to the max_allowed_packet system variable. Let me know if you run into any issues.
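The JSON approach stores the whole array as a single value in one row, so the statement needs only one placeholder per column and the row size is bounded only by max_allowed_packet. A minimal sketch of the serialization step (not the library's actual code):

```cpp
#include <sstream>
#include <string>
#include <vector>

// Hypothetical helper: serialize an array of doubles into a JSON
// array string, suitable for binding to a single JSON column
// placeholder instead of 12 placeholders per point.
std::string to_json_array(const std::vector<double> &values)
{
    std::ostringstream out;
    out << '[';
    for (std::size_t i = 0; i < values.size(); ++i)
    {
        if (i)
            out << ',';
        out << values[i];
    }
    out << ']';
    return out.str();
}
```

With this representation a 251x251 image becomes one row whose JSON value is roughly 63,001 numbers long, comfortably inside a default max_allowed_packet of 64 MB on recent MySQL versions.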