ibmdb / node-ibm_db

IBM DB2 and IBM Informix bindings for node
MIT License

Memory Usage with Array Insert, Silent termination of Node when too much memory used #851

Closed markddrake closed 2 years ago

markddrake commented 2 years ago

I have what appears to be a memory-related issue with array insert operations that causes Node / ibm_db to fail silently.

I am inserting batches of XML documents. I do not track the size of the documents, so I set Length to 16M. If I try to insert a batch of 10,000 rows, my node process simply stops and no errors are reported. If I decrease the batch size, everything works as expected once I get down to 250 rows per batch.

Are there any guidelines available as to how much memory can be used per batch?

Is there any way of configuring the amount of memory available, e.g. using Node's settings? (I suspect the issue is in the ODBC layer, and I'm not sure how that uses Node memory, if at all.)

Shouldn't an error be generated in this scenario?
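
For reference, a minimal sketch of the batch-splitting workaround described above. The `insertBatch` helper is a placeholder for whatever prepared-statement / array-insert call is already in use; it is not part of the ibm_db API, and the batch size of 250 is simply the value reported to work in this issue.

```js
// Cap the number of rows handed to the driver in any single array insert.
const BATCH_SIZE = 250; // the size reported to work reliably in this issue

async function insertInBatches(conn, sql, rows, insertBatch) {
  for (let i = 0; i < rows.length; i += BATCH_SIZE) {
    const batch = rows.slice(i, i + BATCH_SIZE);
    await insertBatch(conn, sql, batch); // hypothetical helper supplied by the caller
  }
}
```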

bimalkjha commented 2 years ago

@markddrake The Length value is used by the ibm_db driver to allocate a memory buffer. How much memory the driver can allocate depends on the capabilities of the system and on how much memory the OS makes available to ibm_db. If binding a large amount of data causes a memory issue, use a smaller value. In your case, everything works fine with a batch of 250, so it is best to stick with that size rather than a larger one. There is no specific guideline or way to configure the available memory. The issue is not in the ODBC layer: the ibm_db layer needs to allocate a buffer of arraySize rows, each of the given Length, and a pointer to that buffer is then passed to the ODBC layer. Thanks.
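
To put rough numbers on this (back-of-envelope, based only on the figures in this thread): with Length set to 16 MB, a 10,000-row batch asks the driver to allocate on the order of 160 GB for a single bound parameter, while a 250-row batch needs roughly 4 GB.

```js
// Back-of-envelope arithmetic for one bound parameter, using the figures
// from this thread: buffer ≈ arraySize * Length bytes.
const lengthBytes = 16 * 1024 * 1024;                  // Length = 16M per row
console.log((10000 * lengthBytes) / 2 ** 30 + " GiB"); // 10,000 rows -> ~156 GiB
console.log((250 * lengthBytes) / 2 ** 30 + " GiB");   //    250 rows -> ~3.9 GiB
```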