Closed abmusse closed 3 years ago
For the SQL_PARAM_INPUT_OUTPUT case, I think we should set the indicator to the string length for both CLOBs and strings here too.
https://github.com/IBM/nodejs-idb-connector/blob/master/src/db2ia/dbstmt.cc#L2610-L2618
else if (param[i].io == SQL_PARAM_INPUT_OUTPUT)
{
  param[i].buf = (char *)calloc(param[i].paramSize + 1, sizeof(char));
  strncpy((char *)param[i].buf, cString, str_length);
  if (bindIndicator == 0) // CLOB
    param[i].ind = str_length;
  else if (bindIndicator == 1) // NTS
    param[i].ind = SQL_NTS;
}
diff --git a/src/db2ia/dbstmt.cc b/src/db2ia/dbstmt.cc
--- a/src/db2ia/dbstmt.cc
+++ b/src/db2ia/dbstmt.cc
@@ -2593,7 +2593,7 @@ int DbStmt::bindParams(Napi::Env env, Napi::Array *params, std::string &error)
const char *cString = string.c_str();
param[i].valueType = SQL_C_CHAR;
- if (param[i].io == SQL_PARAM_INPUT)
+ if (param[i].io == SQL_PARAM_INPUT || param[i].io == SQL_PARAM_INPUT_OUTPUT)
{
param[i].buf = strdup(cString);
param[i].ind = str_length;
@@ -2603,15 +2603,6 @@ int DbStmt::bindParams(Napi::Env env, Napi::Array *params, std::string &error)
param[i].buf = (char *)calloc(param[i].paramSize + 1, sizeof(char));
param[i].ind = param[i].paramSize;
}
- else if (param[i].io == SQL_PARAM_INPUT_OUTPUT)
- {
- param[i].buf = (char *)calloc(param[i].paramSize + 1, sizeof(char));
- strncpy((char *)param[i].buf, cString, str_length);
- if (bindIndicator == 0) //CLOB
- param[i].ind = str_length;
- else if (bindIndicator == 1) //NTS
- param[i].ind = SQL_NTS;
- }
}
else if (bindIndicator == 2)
{ //Parameter is Integer
The error originates in bindParams(), where the indicator/strlen was being set to SQL_NTS instead of the length of the input. As a result, the entire string was not passed through, producing malformed UTF-8 data.
This is still a WIP, since code paths other than the SQL_PARAM_INPUT branch may also need the same fix. Fixes #129