Closed: @sagarcasm closed this issue 1 year ago.
@sagarcasm You can prepare the INSERT statement once, then bind and execute it for every row being inserted, as done by the for loop in this test file: https://github.com/ibmdb/node-ibm_db/blob/master/test/bench-prepare-bind-execute-closeSync.js. Alternatively, you can create arrays of column data of size 20 or 50 and do an array insert, as shown in this test file: https://github.com/ibmdb/node-ibm_db/blob/master/test/test-array-insert.js. In your shared program, you are doing prepare, execute, and fetch for every row, which is inefficient for inserts. If you want to keep your current program, prepareSync should be called only once; then run executeSync with the data for each row and close the result and the statement. There is no need to call fetch. Thanks.
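A minimal sketch of the prepare-once / execute-per-row pattern described above, using node-ibm_db's synchronous API. The connection string, table name, and column names here are assumptions for illustration; adapt them to your schema.

```javascript
// Pure helper: turn row objects into positional parameter arrays that match
// the order of the '?' placeholders in the INSERT statement.
function toParams(rows, columns) {
  return rows.map(row => columns.map(col => row[col]));
}

function bulkInsert(connStr, rows) {
  // Loaded lazily so toParams() stays usable without a Db2 install.
  const ibmdb = require('ibm_db');
  const conn = ibmdb.openSync(connStr);
  try {
    // Prepared once; only bind + execute happen per row.
    // MYTABLE and its columns are hypothetical names.
    const stmt = conn.prepareSync(
      'INSERT INTO MYTABLE (ID, NAME) VALUES (?, ?)');
    for (const params of toParams(rows, ['id', 'name'])) {
      const result = stmt.executeSync(params); // bind and execute this row
      result.closeSync();                      // no fetch needed for an INSERT
    }
    stmt.closeSync();
  } finally {
    conn.closeSync();
  }
}
```

For even better throughput, wrap the loop in a transaction (conn.beginTransactionSync() / conn.commitTransactionSync()) so each row is not committed individually.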
@sagarcasm Hope you are able to make progress. Let us know otherwise. Thanks.
The code example in this issue shows an array insert: https://github.com/ibmdb/node-ibm_db/issues/883. The technique shown works perfectly for all data types except the binary ones, e.g. the combination of BINARY, VARBINARY, CHAR FOR BIT DATA, and BLOB columns with a JavaScript Buffer.
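For the array-insert approach, the test file linked above binds one array per column rather than one array per row. As a sketch, a pure helper can pivot row-wise records into that column-wise shape; the exact bind/execute calls below are an assumption and should be checked against test/test-array-insert.js.

```javascript
// Pivot row objects into per-column arrays, the shape node-ibm_db's
// array insert expects (one array of values per '?' placeholder).
// All names here are illustrative.
function pivotToColumns(rows, columns) {
  return columns.map(col => rows.map(row => row[col]));
}

// Assumed usage (verify against test/test-array-insert.js):
//   const stmt = conn.prepareSync('INSERT INTO MYTABLE (ID, DATA) VALUES (?, ?)');
//   stmt.bindSync(pivotToColumns(rows, ['id', 'data']));
//   stmt.executeSync().closeSync();
//   stmt.closeSync();
```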
ibm_db@3.1.0 has fixed this issue. Thanks.
Hi team,
Could you help me understand the best way to do a bulk insert into a Db2 database? We are using the piece of code below for inserts, but it takes a horrendous amount of time even to load 500 records, which is not efficient in our case.
Could you suggest a more efficient way of doing batch inserts?