Closed patrickhrastnik closed 3 years ago
What's the CCSID of the CLOB?
The table object has CCSID 273, and according to catalog SYSCOLUMNS, the definition of field LOMEJL has CCSID 273.
Two SQL INSERT statements are performed one right after the other.
When debugging / single-stepping, it looks to me like the first one (holog.jwtjtp, dds defined, no special columns) always succeeds, and the second one (holog.holojp, sql defined - see definition above) always crashes.
When I try to run the exact statement (insert into holog.holojp) with another sql tool (in my case DBeaver), it works just fine. But I gotta say, I have to copy-paste the parameter values from somewhere, e.g. VS Code debugging console - that could influence the value and change CCSIDs. So that's an unfair comparison.
@patrickhrastnik Did you manage to find a solution for this issue?
No, not yet... I'm struggling to figure out what the issue is. It doesn't seem to have anything to do with codepages or special characters. But I haven't had much time to investigate the issue further, or to test different settings. What's putting me off is that when I deployed the feature for the next team to work with, everything worked as intended. A few weeks/months later the feature was broken - and I wasn't able to find out what changed in the meantime. Nevertheless, we decided last week to redevelop the broken feature using an ELK stack. That was the final goal after all. I'm just curious whether this bug only occurs with CLOB fields, with certain SQL-defined tables, ... whether I have to expect other features to break.
I'm experiencing the same issue. We have a table defined as follows:
CREATE TABLE ORDERS ( ORDER_ID CHAR(25) DEFAULT ' ' NOT NULL, ERP_NAME CHAR(50) DEFAULT ' ' NOT NULL );
In our case the error occurs when we insert the following string into the column ERP_NAME:
'DANF.MAGN.VENTIL 32U7115 TIL ÅBNE-/SANITÆRE ANLÆG'
I've tried to play around with the string by removing and replacing characters. Apparently, in my case it accepts the string if I replace the 'Æ' in the word 'Anlæg' with 'A' or any other standard character.
The strange thing, as you also mentioned, is that it works fine in DBeaver. I think this might be a bug in idb-pconnector.
@commIT-Esbjerg
What's the CCSID for your CHAR fields?
Are you also running into the same errors listed above?
Error: SQLSTATE=58004 SQLCODE=-901 ("UNSUCCESSFUL EXECUTION CAUSED BY A SYSTEM ERROR THAT DOES NOT PRECLUDE THE SUCCESSFUL EXECUTION OF SUBSEQUENT SQL STATEMENTS")
Error: SQLSTATE=22504 SQLCODE=-191 ("A STRING CANNOT BE USED BECAUSE IT IS INVALID MIXED DATA")
Here is the code I used to try and reproduce the error:
const { Connection, Statement } = require('idb-pconnector');

async function main() {
  const connection = new Connection({ url: '*LOCAL' });
  connection.debug(true);
  const statement = new Statement(connection);
  const schema = 'TEST';
  const schemaResult = await statement.exec(`SELECT SCHEMA_NAME FROM qsys2.sysschemas WHERE SCHEMA_NAME = '${schema}'`);
  await statement.closeCursor();
  if (!schemaResult.length) {
    await statement.exec(`CREATE SCHEMA ${schema}`);
  }
  await statement.exec(`CREATE OR REPLACE TABLE ${schema}.ORDERS ( ORDER_ID CHAR(25) DEFAULT ' ' NOT NULL, ERP_NAME CHAR(50) DEFAULT ' ' NOT NULL);`);
  await statement.exec(`INSERT INTO ${schema}.ORDERS(ERP_NAME) VALUES('DANF.MAGN.VENTIL 32U7115 TIL ÅBNE-/SANITÆRE ANLÆG') with NONE`);
  const results = await statement.exec(`SELECT * FROM ${schema}.ORDERS`);
  console.log(`Result Set:\n ${JSON.stringify(results)}`);
  await statement.close();
  await connection.close();
}

main().catch((error) => {
  console.error(error);
});
Output:
SQLExecDirect(0): SELECT SCHEMA_NAME FROM qsys2.sysschemas WHERE SCHEMA_NAME = 'TEST'
SQLNUMRESULTSCOLS(0) Column Count = 1
DescCol(0) index[0] Type[VARCHAR] Scale[0] Precise[128]
SQLCloseCursor: stmth 3
SQLCloseCursor(0)
SQLExecDirect(1): CREATE OR REPLACE TABLE TEST.ORDERS ( ORDER_ID CHAR(25) DEFAULT ' ' NOT NULL, ERP_NAME CHAR(50) DEFAULT ' ' NOT NULL);
SQLExecDirect SUCCESS_WITH_INFO (7905) 01567
SQLNUMRESULTSCOLS(0) Column Count = 0
NO RESULTS: SQLExecDirect() call
SQLExecDirect(0): INSERT INTO TEST.ORDERS(ERP_NAME) VALUES('DANF.MAGN.VENTIL 32U7115 TIL ÅBNE-/SANITÆRE ANLÆG') with NONE
SQLNUMRESULTSCOLS(0) Column Count = 0
NO RESULTS: SQLExecDirect() call
SQLExecDirect(0): SELECT * FROM TEST.ORDERS
SQLNUMRESULTSCOLS(0) Column Count = 2
Result Set:
[{"SCHEMA_NAME":" ","":"D"}]
SQLFreeStmt: stmth 3 [SQL_DROP]
SQLFreeStmt(0)
SQLDisconnect: conn obj [1803aabf0] handler [2]
SQLFreeConnect: conn obj [1803aabf0] handler [2]
SQLFreeConnect[0]
On my end the CCSID is 37. The insert is successful, but there is an issue fetching the data, as can be seen from the result set output:
Result Set:
[{"SCHEMA_NAME":" ","":"D"}]
We will need to further investigate how to resolve retrieving the data at the idb-connector level.
Although selecting the data from Run Sql Scripts shows the proper string:
@kadler Should users be using UTF-8 CCSID 1208?
@abmusse
I'm using CCSID 277 for the CHAR fields. I tried to change the CCSID to 1208 as you mentioned and the record was inserted successfully, but the string changes to this when inserted:
DANF.MAGN.VENTIL 32U7115 TIL ÅBNE-/SANITÆRE ANLð
At first sight I thought it might be the 'Æ' character that messes up the string, but then I saw another record displaying the 'Æ' character as it should (record no. 4).
The error I'm getting when using CCSID 277 for the CHAR fields is as follows:
SQLSTATE=22504 SQLCODE=-191 Mixed data or UTF-8 data not properly formed.
:wave: Hi! This issue has been marked stale due to inactivity. If no further activity occurs, it will automatically be closed.
Go away, stale-bot!
@commIT-Esbjerg @patrickhrastnik Can you each please give your ILE job locale information (CNTRYID, LANGID, job CCSID, default job CCSID) when running node (e.g. system dspjob | grep -E 'CNTRYID|LANGID|CCSID')?
Given that we can partially recreate the issue, hopefully with this we can more accurately recreate it and use internal tools to trace and diagnose the issue.
@kadler Here's the output of this command on our two systems.
Test system:
patrick@testas:~$ system dspjob | grep -E 'CNTRYID|LANGID|CCSID'
Language ID . . . . . . . . . . . . . . . . : LANGID DEU
Country or region ID . . . . . . . . . . . : CNTRYID AT
Coded character set ID . . . . . . . . . . : CCSID 273
Default CCSID . . . . . . . . . . . . . . . : DFTCCSID 273
Name Value CCSID
QIBM_PASE_CCSID '819' 273
Production system:
patrick@holter:~$ system dspjob | grep -E 'CNTRYID|LANGID|CCSID'
Language ID . . . . . . . . . . . . . . . . : LANGID DEU
Country or region ID . . . . . . . . . . . : CNTRYID AT
Coded character set ID . . . . . . . . . . : CCSID 273
Default CCSID . . . . . . . . . . . . . . . : DFTCCSID 273
Name Value CCSID
QIBM_PASE_CCSID '819' 273
@kadler
@commIT-Esbjerg
I was able to reproduce this issue within idb-connector:
Steps to reproduce:
1) Create the table
CREATE OR REPLACE TABLE TEST.ORDERS(
ORDER_ID CHAR(25) CCSID 277 DEFAULT ' ' NOT NULL,
ERP_NAME CHAR(50) CCSID 277 DEFAULT ' ' NOT NULL);
2) Run the following script
const { dbconn, dbstmt, IN, CHAR } = require('idb-connector');

const insert = `INSERT INTO TEST.ORDERS(ERP_NAME) VALUES(?) with NONE`;
const value = 'DANF.MAGN.VENTIL 32U7115 TIL ÅBNE-/SANITÆRE ANLÆG';
const parametersOldFormat = [[value, IN, CHAR]];

// log the length of the input string and its byte length
console.log(`Length of value: ${value.length} byte length: ${(new TextEncoder().encode(value)).length}`);

const connection = new dbconn();
connection.debug(true);
connection.conn('*LOCAL');
const statement = new dbstmt(connection);

statement.prepare(insert, (error) => {
  if (error) { throw error; }
  statement.bindParam(parametersOldFormat, (error) => {
    if (error) { throw error; }
    statement.execute((out, error) => {
      if (error) { throw error; }
      statement.close();
      connection.disconn();
      connection.close();
    });
  });
});
$ node test.js
Length of value: 49 byte length: 52
SQLConnect(0): conn obj [180508270] handler [2]
Prepare().
SQLPrepare(0): INSERT INTO TEST.ORDERS(ERP_NAME) VALUES(?) with NONE
BindParamAsync().
SQLBindParameter(0) TYPE[ 1] SIZE[ 50] DIGI[0] IO[1] IND[ -3] INDEX[0]
Execute().
SQLExecuteAsync(-1):
undefined:0
[Error: SQLSTATE=22504 SQLCODE=-191 Mixed data or UTF-8 data not properly formed.]
As you can see from the output above, the value to insert, DANF.MAGN.VENTIL 32U7115 TIL ÅBNE-/SANITÆRE ANLÆG, contains 49 characters, but its byte length is 52.
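The 49-vs-52 discrepancy is just JavaScript's String length (UTF-16 code units) versus the UTF-8 byte length: the three characters Å, Æ, Æ each encode to two bytes in UTF-8. A quick way to see both in Node.js:

```javascript
// String length counts UTF-16 code units; Buffer.byteLength counts UTF-8 bytes.
const value = 'DANF.MAGN.VENTIL 32U7115 TIL ÅBNE-/SANITÆRE ANLÆG';

console.log(value.length);                     // 49 characters
console.log(Buffer.byteLength(value, 'utf8')); // 52 bytes (Å, Æ, Æ take 2 bytes each)
```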
Interestingly enough, I found that if you use bindParameters instead of bindParam, the insert is successful.
const { dbconn, dbstmt } = require('idb-connector');

const insert = `INSERT INTO TEST.ORDERS(ERP_NAME) VALUES(?) with NONE`;
const value = 'DANF.MAGN.VENTIL 32U7115 TIL ÅBNE-/SANITÆRE ANLÆG';
const select = 'SELECT ERP_NAME FROM TEST.ORDERS';
const parameters = [value];

const connection = new dbconn();
connection.debug(true);
connection.conn('*LOCAL');
const statement = new dbstmt(connection);

statement.prepare(insert, (error) => {
  if (error) { throw error; }
  statement.bindParameters(parameters, (error) => {
    if (error) { throw error; }
    statement.execute((out, error) => {
      if (error) { throw error; }
      statement.exec(select, (result) => { // retrieve the inserted value
        console.log(JSON.stringify(result));
        statement.close();
        connection.disconn();
        connection.close();
      });
    });
  });
});
$ node test.js
Length of value: 49 byte length: 52
SQLConnect(0): conn obj [1804c5310] handler [2]
Prepare().
SQLPrepare(0): INSERT INTO TEST.ORDERS(ERP_NAME) VALUES(?) with NONE
BindParameters().
SQLBindParameter(0) TYPE[ 1] SIZE[ 50] DIGI[0] IO[3] IND[ 52] INDEX[0]
Execute().
SQLExecuteAsync(0):
SQLNUMRESULTSCOLS(0) Column Count = 0
NO RESULTS :SQLExecuteAsync()
SQLExecDirect(0): SELECT ERP_NAME FROM TEST.ORDERS
SQLNUMRESULTSCOLS(0) Column Count = 1
DescCol(0) index[0] Type[CHAR] Scale[0] Precise[50]
[{"ERP_NAME":"DANF.MAGN.VENTIL 32U7115 TIL ÅBNE-/SANITÆRE ANLÆG "}]
SQLFreeStmt: stmth 3 [SQL_DROP]
SQLFreeStmt(0)
SQLDisconnect: conn obj [1804c5310] handler [2]
SQLFreeConnect: conn obj [1804c5310] handler [2]
SQLFreeConnect[0]
The main difference here is that within bindParameters the indicator is set to the length of the UTF-8 string in bytes (52), whereas in bindParam the indicator is set to SQL_NTS. SQL CLI then does not read the proper length of the string buffer, the data is not valid UTF-8, and hence the error: SQLSTATE=22504 SQLCODE=-191 Mixed data or UTF-8 data not properly formed. We can fix this by setting the indicator properly, as is done in the bindParameters case.
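To illustrate the two indicator modes (a sketch of the idea, not idb-connector's actual internals): SQL_NTS (-3) tells SQL CLI to derive the length from a null terminator itself, while an explicit indicator hands it the UTF-8 byte length up front, matching the IND[ 52] vs IND[ -3] values seen in the two traces above.

```javascript
const SQL_NTS = -3; // standard ODBC/CLI "null-terminated string" indicator

// bindParameters-style indicator: the UTF-8 byte length of the bound value
function utf8Indicator(value) {
  return Buffer.byteLength(value, 'utf8');
}

const value = 'DANF.MAGN.VENTIL 32U7115 TIL ÅBNE-/SANITÆRE ANLÆG';
console.log(utf8Indicator(value)); // 52, matching IND[ 52] in the successful trace
console.log(SQL_NTS);              // -3, matching IND[ -3] in the failing trace
```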
I'm going to transfer this issue up to idb-connector to start working on a fix.
I have had the issue since I updated idb-connector from 1.2.3 to the most recent version.
Describe the bug
When I try to run an SQL INSERT statement, I get either of these error codes.
I try to insert data into an SQL-defined table with a CLOB field.
The funny thing is, some months ago (when I implemented the feature which is now broken), the statement performed without any issues. Now I get the same behavior on both of our IBM i machines. I think the error might originate in our environment or runtime - I just don't understand the error. Hopefully someone of you can help me figure it out.
To Reproduce
SQL table definition:
Node.js TypeScript code (snippets from my setup - please ask for further details if you need some):
Expected behavior
The supplied SQL statement should add an entry to table holog/HOLOJP.
Additional context: Here's some debug output from the database connection. The JSON error message was generated after the reject by class Logger. You can find the actual values for the parameters in this output - I just can't see why they should have a wrong format.