Closed jkdavew closed 5 years ago
I have recreated the problem and will work on getting it fixed today. Thanks for testing it out and offering feedback!
Fixed, 99% sure I will get it all merged in today. If you have an urgent need to use it, feel free to download from my GitHub:
npm install markdirish/node-odbc#v2.0
That's great...thanks!
Fantastic...I'll check the live repo tomorrow or will experiment with yours! Appreciate the quick turnaround!
After the npm update it worked! Thanks!
Noticed the beta has been removed and my old test didn't work.
This connection technique has been dropped:
const { Connection } = require('odbc');
const connection = new Connection('DSN=MYNAME');
The new method is to use a callback/promise. It took a couple of minutes to adjust using the listed examples, but it works now.
Totally unrelated, but I figured I'd let you know: several examples show "INSTERT" instead of "INSERT"...I think it's wrong unless I'm missing some new SQL keyword :).
Yeah, the old constructor method of creating a Connection had poor performance (since constructors are synchronous). By doing odbc.connect('DSN=MYNAME'), there shouldn't be little hangs whenever a connection is made.
I will fix the typos, thank you!
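For anyone else hitting this, the replacement pattern looks roughly like the sketch below. Note that `connect` here is a local stand-in (an assumption, so the snippet runs without an ODBC driver or DSN); with the real package the call would be odbc.connect(...), in either callback or promise form:

```javascript
// Sketch of the asynchronous connection pattern described above.
// `connect` is a local stand-in, NOT node-odbc's implementation; it
// only mimics the callback/promise call shape.
function connect(connectionString, callback) {
  const connection = {
    connectionString,
    query: async (sql) => [], // placeholder; returns no rows
  };
  const promise = Promise.resolve(connection);
  if (callback) {
    promise.then((conn) => callback(null, conn));
  }
  return promise;
}

// Callback form:
connect('DSN=MYNAME', (error, connection) => {
  if (error) throw error;
  console.log('callback connected to', connection.connectionString);
});

// Promise form:
(async () => {
  const connection = await connect('DSN=MYNAME');
  console.log('promise connected to', connection.connectionString);
})();
```

Because the factory is asynchronous, connection setup no longer blocks the event loop the way a synchronous constructor did.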
Ok, thanks for clarifying the Connection changes.
We have found another issue with numeric data and "carry down / memory" issues. I can raise it as a separate one, or we can re-open this issue.
Just a hunch that it's related to these statements in src/odbc.cpp - note: I'm not a C++ person.
maxColumnLength = (column->ColumnSize + 1) * sizeof(SQLCHAR);
const odbc = require("odbc");

async function queryExample() {
  const pool = await odbc.pool("DSN=AS400PCR");
  const result = await pool.query(
    "select my_numeric_data from mylib/myfile order by my_numeric_data"
  );
  console.log(result);
}

queryExample();
Data Setup:
create table mylib/myfile (
  my_numeric_data decimal(3,0)
);
insert into mylib/myfile values(1);
insert into mylib/myfile values(2);
insert into mylib/myfile values(3);
insert into mylib/myfile values(4);
insert into mylib/myfile values(5);
insert into mylib/myfile values(6);
insert into mylib/myfile values(7);
insert into mylib/myfile values(8);
insert into mylib/myfile values(9);
insert into mylib/myfile values(10);
insert into mylib/myfile values(11);
insert into mylib/myfile values(12);
insert into mylib/myfile values(13);
insert into mylib/myfile values(14);
insert into mylib/myfile values(15);
The expected result is that the data would come out as 1,2,3, etc, but it gets messed up from time to time - sometimes hard to reproduce, other times very consistently bad (example below):
[
  { MY_NUMERIC_DATA: 1 },
  { MY_NUMERIC_DATA: 2 },
  { MY_NUMERIC_DATA: 3 },
  { MY_NUMERIC_DATA: 4 },
  { MY_NUMERIC_DATA: 5 },
  { MY_NUMERIC_DATA: 6 },
  { MY_NUMERIC_DATA: 7 },
  { MY_NUMERIC_DATA: 8 },
  { MY_NUMERIC_DATA: 9 },
  { MY_NUMERIC_DATA: 10 },
  { MY_NUMERIC_DATA: 11 },
  { MY_NUMERIC_DATA: 12 },
  { MY_NUMERIC_DATA: 13 },
  { MY_NUMERIC_DATA: 14 },
  { MY_NUMERIC_DATA: 15 },
  statement: 'select my_numeric_data from mylib/myfile order by my_numeric_data',
  parameters: [],
  return: undefined,
  count: -1,
  columns: [ { name: 'MY_NUMERIC_DATA', dataType: 3 } ]
]
What does it look like when it is bad/corrupted? I recreated the table and ran the query, and get output like you showed in the example.
I work with Dave.
Sometimes a 5 will turn into a 51, or a 2 will become a 21. It seems that if it's a single digit response, an extra digit sometimes gets added on to it. When doing queries from 1 to 15, I've never seen the extra digit greater than 5 if that helps.
So far it happens with UDTFs in RPG returning packed fields, and also when reading a DDS file that has packed fields. I haven't tried DDL files yet, so don't rule those out.
Also note that it's usually fine the first run or two; the corruption usually appears when the query is repeated a few times. If I repeat the "queryExample();" line 3 times, it happens for me. Basically, it occurs when the same query is called 3 times concurrently and asynchronously.
Here's how it looks when it's bad:
[
{ MY_NUMERIC_DATA: 18, COL2: 'Hello: 1' },
{ MY_NUMERIC_DATA: 27, COL2: 'Hello: 2' },
{ MY_NUMERIC_DATA: 3, COL2: 'Hello: 3' },
{ MY_NUMERIC_DATA: 4, COL2: 'Hello: 4' },
{ MY_NUMERIC_DATA: 5, COL2: 'Hello: 5' },
{ MY_NUMERIC_DATA: 6, COL2: 'Hello: 6' },
{ MY_NUMERIC_DATA: 7, COL2: 'Hello: 7' },
{ MY_NUMERIC_DATA: 8, COL2: 'Hello: 8' },
{ MY_NUMERIC_DATA: 9, COL2: 'Hello: 9' },
{ MY_NUMERIC_DATA: 10, COL2: 'Hello: 10' },
{ MY_NUMERIC_DATA: 11, COL2: 'Hello: 11' },
{ MY_NUMERIC_DATA: 12, COL2: 'Hello: 12' },
{ MY_NUMERIC_DATA: 13, COL2: 'Hello: 13' },
{ MY_NUMERIC_DATA: 14, COL2: 'Hello: 14' },
{ MY_NUMERIC_DATA: 15, COL2: 'Hello: 15' },
statement: "select my_numeric_data, 'Hello: ' || my_numeric_data " +
'COL2 from mylib/myfile order by my_numeric_data asc',
parameters: [],
return: undefined,
count: -1,
columns: [
{ name: 'MY_NUMERIC_DATA', dataType: 3 },
{ name: 'COL2', dataType: 12 }
]
]
Note the first record: just the numeric field is bad. When I added an artificial 2nd column of data concatenating the numeric with the word "Hello", that data is fine.
Alright, I have a script running that seems to be able to recreate it (sometimes!). Will try to debug it.
that's great!! Seems to be limited to the numeric data if that helps.
Someone else seems to be having an issue with floating point numbers on the same fork. It's probably related.
The float issue is actually unrelated (well, related to the original issue in this thread), but I think I have solved them both. Before I push to the master branch and update the npm package, would you mind verifying the fix on your tests? It can be found on the branch:
https://github.com/markdirish/node-odbc/tree/bound_row_data
(Also, in the future, if you wouldn't mind posting issues on my fork of this repo. I keep an eye on both of them, but with the update to 2.0 I'm trying to keep the newer stuff over there. Thanks!)
I did some testing and it appears to work so far. (I tested with UDTFs and DDS files.)
I'll let you know if there are any more issues.
Mark - that's great! Thanks again for the fix. I'll look to post any future issues to your repo. What's the timing for publishing this to npm?
Fixing up an issue with callProcedure, cleaning up the README, and hoping to push to npm in the next hour!
Should be available now, 2.0.0-3
Great - thanks! We're continuing to check results...will post to your fork if anything comes up.
When querying a table with a packed decimal field, the value isn't being returned properly.
For instance, a value of 1 in the database is being returned as 6.9073350249177e-310.
A workaround seems to be to modify the query like this.
For example this : SELECT AGE FROM TABLE
Would become: SELECT INT(AGE) AGE FROM TABLE
This seems to be an issue with the node package since when directly using the unixodbc isql command the data is returned properly.
Thanks
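The tiny value reported (6.9073350249177e-310) is a subnormal double, which is the kind of number you get when raw integer or packed-decimal bytes are reinterpreted as an IEEE 754 double instead of being converted. Here is a minimal illustration of that effect; the exact bytes are an assumption for demonstration, not the driver's actual buffer contents:

```javascript
// Reinterpreting a small integer's raw bytes as a double yields a
// tiny subnormal value, similar in magnitude to the one reported.
const buf = Buffer.alloc(8);
buf.writeBigUInt64LE(1n); // the raw integer 1 stored in 8 bytes

// Reading those same bytes as a double does not give 1:
console.log(buf.readDoubleLE(0)); // 5e-324, a subnormal double
```

That also fits the SELECT INT(AGE) workaround: casting in SQL makes the driver return a plain integer, sidestepping whatever decimal-to-double conversion is going wrong.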