Thanks @fluffynuts!

By including your test in the `test/integration` directory, it will be run as part of `npm run test`, so it's not necessary to add the `test-me` script to the `package.json`.

To test a specific file, you can use the `FILTER` environment variable:

FILTER=test-multi-result-streaming npm run test
Also, to keep good practices, here are some small suggestions.

Change from:

```js
const { createConnection } = require('../common.test.cjs');

(async function () {
  'use strict';

  const { assert } = require('poku');

  // ...
```

To:

```js
'use strict';

const { assert } = require('poku');
const { createConnection } = require('../common.test.cjs');

(async () => {
  // ...
```
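For reference, a minimal sketch of a complete test following this layout could look like the example below; it assumes `createConnection` from the shared helper yields a regular mysql2 connection, and the query and assertions are placeholders for illustration only:

```js
'use strict';

const { assert } = require('poku');
const { createConnection } = require('../common.test.cjs');

(async () => {
  // Assumption: createConnection() yields a standard mysql2 connection
  // (awaiting a non-promise value is harmless if the helper is synchronous).
  const connection = await createConnection();

  // Placeholder query: any statement with a predictable result works here.
  connection.query('SELECT 1 + 1 AS result', (err, rows) => {
    assert.equal(err, null);
    assert.equal(rows[0].result, 2);
    connection.end();
  });
})();
```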
thanks @wellwelwel
I didn't know how to filter poku tests (this is the first time I've seen poku, tbh), so that was meant to be a temporary npm script until I had a passing test (but it turned out I could run it through WebStorm anyway, so...). Anyway, it's removed and the requested changes have been made.
@fluffynuts, sorry, I hadn't noticed something before: you need to name your test file as `*.test.cjs` in the `test/integration/**` path.

```diff
- test/integration/test-multi-result-streaming.cjs
+ test/integration/test-multi-result-streaming.test.cjs
```
All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 90.32%. Comparing base (9edfd72) to head (f5bb633).
@fluffynuts, for some reason `test/integration/test-multi-result-streaming.test.cjs:54:12` gives a random result. Can you check it?

You can also run it directly with the native Node.js command:

node test/integration/test-multi-result-streaming.test.cjs
I don't think that it's related to this issue, but you don't need to try-catch the assert:

```diff
- try {
    assert.equal(captured1.length, 1);
    assert.equal(captured2.length, 1);
    assert.deepEqual(captured1[0], compare1[0][0]);
    assert.deepEqual(captured2[0], compare2[0][0]);
-   process.exit(0);
- } catch (e) {
-   console.error(e);
-   process.exit(1);
- }
```
About the lint error, you can just run `npx prettier --write ./test/integration/test-multi-result-streaming.test.cjs`.
@wellwelwel I've finally found a failure (sorry, it's right at the end, and searching for my test name was coming up blank).

I've literally just run this test more than 100x without issue, so perhaps it's specific to Node 21 and/or something else. I'll investigate, but I see a lot of errors like `Warning: got packets out of order. Expected 27 but received 2`, and I don't even know the source of that error; searching this project doesn't turn it up. I'd appreciate any help on this, as I'm not a full-time Node dev and I don't always have time to noodle about with non-LTS Node variants. Like I say, I'll have a look, but I don't know if I'll get anywhere.
I also notice that the errors only occur with Node >= 20 and SSL disabled for MySQL 8.0.33, with errors about packets. I'm wondering if there's an SSL-related issue in there somewhere.

Edit: using MySQL 8.0.35 on Windows doesn't repro the issue at all, with Node 20 or 21. I'm wondering if an update between 8.0.33 and 8.0.35 fixed something. There was an SSL-related update in 8.0.34 which affects systems shipping their own SSL libraries (e.g. good Linux distros). Would it be possible to update the 8.0.33 matrix target to 8.0.35 to test the idea?
I don't believe that it's related to a specific Node and MySQL version. For example, this time the previous issue happened for Node.js 14 and 16, and also for MySQL 8.0.33 with SSL `0` and `1` (same for compression).

This really does seem to happen at random. Apart from real debugging, the only similarity is that the errors only occur on Linux, not on Windows and OSX.

About the "Warning: got packets..." message, it's not causing the error. The problem occurs here:

`assert.deepEqual(captured2[0], compare2[0][0]);`

Also, you don't need to force the process exit code:

- process.exit(0);
> I'd appreciate any help on this

I can try to do some debugging at the weekend, but in any case I've added a "help wanted" label.
Hey @fluffynuts, there's a CI exclusive for MySQL versions. In that case, we already use the `mysql:latest` image.
enable GHA for this branch, hopefully
@fluffynuts, you can submit a PR to your own fork to test independently; that will enable all the CI for pull request events in your fork.
I've now run this test around 400x with and without TLS on an Ubuntu 22.04.4 machine with MySQL 8.0.36 (just installed) and saw no errors. I'm going to try bumping the MySQL version in the GitHub Actions file.
> Hey @fluffynuts, there's a CI exclusive for MySQL versions. In that case, we already use the `mysql:latest` image.
cool, but that's not the fixture that's failing: linux-ci is failing on version 8.0.33. I've tried upping that to 8.0.36, and I get more failures (Node 14 as well!), but all of them have this junk in their output, and I haven't seen it in about 1000x runs on an Ubuntu VM in VirtualBox.
Can you try using a Docker container? Usually I use the `node:${version}-alpine` images, but for this case I think using `node:${version}` is better to test it, ensuring the Ubuntu environment.

If you don't know how to reproduce these tests on Docker, check https://github.com/sidorares/node-mysql2/issues/2493#issuecomment-1992619188.
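For anyone following along, a rough sketch of that approach (an illustrative assumption, not the exact steps from the linked comment) could be to mount the checkout into an official Node image and reuse the `FILTER` variable mentioned earlier, assuming a MySQL server is reachable from the container:

```sh
# Hedged sketch: run the single integration test inside a Debian-based Node
# image; --network host lets the container reach a MySQL server listening on
# the host. Adjust the Node tag and connection details to match your setup.
docker run --rm -it \
  --network host \
  -v "$PWD":/app \
  -w /app \
  node:20 \
  sh -c "npm ci && FILTER=test-multi-result-streaming npm run test"
```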
As before, the error occurs on the assert (`assert.deepEqual(captured2[0], compare2[0][0])`); these warnings aren't causing the CI failure.

Also, forcing the process exit code to `0` can hinder the debugging process:

- process.exit(0);
After running the full suite in the same Ubuntu VM, I am seeing the same "Warning: got packets out of order", but those tests don't fail, so that's odd.

I'll try to get Docker running.
> As before, the error occurs on the assert (`assert.deepEqual(captured2[0], compare2[0][0])`); these warnings aren't causing the CI failure.

I'd agree that warnings, in themselves, don't cause test failures. I'd also postulate that if network comms to the MySQL server are not running correctly, code that assumes it can query data from the database without error may well be wrong in that assumption. I only saw these warnings on the Ubuntu GHA hosts, until right now when I ran the full set of tests (but I only see those warnings on other tests, which aren't failing). What I've read about that warning suggests that data can be lost during transmission when it occurs, which would explain the failure.
Can you please revert the changes in the `.github/workflows/ci-linux.yml`?

Sorry about that, but I don't see an advantage to changing the MySQL version in the Node / Bun CI. It would fit better in "CI - MySQL", but since we've already gotten the error and it's happening in different cases, I don't see why we should include that version in the matrix at this moment.
similarly, there's no disadvantage; however I've reverted.
It's not about that, @fluffynuts. The Node & Bun CI is focused on testing the versions of Node and Bun against a stable MySQL Server version over time. The "CI - MySQL" workflow, in turn, is about testing different versions of MySQL Server against a stable version of Node.

You can see this PR for more details:
> If you don't know how to reproduce these tests on Docker, check the #2493 (comment)

I've attempted to do what is laid out in that issue. As I understand it, the point here is to isolate a specific version of MySQL on a Linux host (because that's what Docker on Windows will provide)?

Unfortunately, this is what I get, over multiple attempts. I'm no Docker guru, so I have no idea how to continue from here.
Look at this:

```
uncaughtException [Error: B81A0000:error:0A00041A:SSL routines:ssl3_read_bytes:tlsv1 alert decode error:c:\ws\deps\openssl\openssl\ssl\record\rec_layer_s3.c:1590:SSL alert number 50
] {
  library: 'SSL routines',
  reason: 'tlsv1 alert decode error',
  code: 'ERR_SSL_TLSV1_ALERT_DECODE_ERROR',
  fatal: true
}

✘ 108/115 node test\integration\test-multi-result-streaming.test.cjs
```
By removing `process.exit(0)`, we now have more debugging details, including the fact that this time the failure also occurred on Windows and OSX.
ok, well then I have no idea how to continue with this. Very obviously, adding support for a dataset index has nothing to do with TLS; that's an environmental error, surely?

This is a great advantage of automated testing, ensuring that a change has no side effects.
agreed, if the change and error can be linked somehow. I understand you're saying you believe these three lines have caused the observed SSL issues:

I would ask you to have a look at this run from another fork: https://github.com/fluffynuts/node-mysql2/actions/runs/8292109504/job/22692880896

All I've done is copy in my streaming test and make it not care about the fact that there are multiple datasets (i.e. expect the dataset index not to be passed, as is the case in node-mysql2 right now in `main`). That test fails too. Perhaps that test code triggers some bad behavior? Can anyone spot an obvious error in there? Whilst I do sometimes write code for Node, it isn't my primary focus, so I'm quite willing to accept that my test code is faulty somehow.
One point in particular that I haven't been 100% sure about (and perhaps this is causing comms mashups): I wanted to know about the moment when we switch from one dataset to another, and I saw that (a) the stream emits `readable` and (b) if I don't `.read()`, then the process hangs, so I have to `.read()`. But now I'm wondering if that `.read()` is interleaved in the middle of the stream handler doing stuff, causing the problem. And if so, that opens a whole new box to investigate, though personally, I'm no closer to a solution.
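To make the pattern being described concrete, here is a rough, hedged sketch (not the actual test file; the connection settings and the two SELECT statements are placeholders) of consuming a multi-statement query as an object stream via `readable` and explicit `.read()` calls:

```js
'use strict';

// Hedged sketch only: placeholder connection settings and queries.
const mysql = require('mysql2');

const connection = mysql.createConnection({
  host: '127.0.0.1',
  user: 'root',
  password: '',
  multipleStatements: true, // needed for 'SELECT ...; SELECT ...'
});

const stream = connection.query('SELECT 1 AS a; SELECT 2 AS b').stream();
const captured = [];

stream.on('readable', () => {
  // Without an explicit read the process was observed to hang; the open
  // question above is whether reading here interleaves with the stream's
  // own handling of the switch between result sets.
  let row;
  while ((row = stream.read()) !== null) {
    captured.push(row);
  }
});

stream.on('end', () => {
  console.log(captured);
  connection.end();
});
```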
I'm not using my devices these days, so I can only do some more limited things, but I intend to debug this carefully. I'm going to try out a few things; if there are any positive results, I'll bring them back here.
@wellwelwel sorry to bother, just an update that we're running our fork in production on a fairly busy system and all seems ok, so I have confidence in my patch, but would like to know what's breaking here ):
@fluffynuts, sorry to keep you waiting.

I noticed a simple issue with the results order, and I've also changed from `.destroy` to `.end`. Additionally, I ensured that some ESLint rules were not being checked by CI.

Now all the tests pass. Could you review it (#2628)?

@fluffynuts, thanks again for your work!

I preferred not to modify your branch directly, but since all your commits were preserved in PR #2628, welcome to the MySQL2 contributors!

I've also linked this PR to the release (#2719). And again, apologies for the delay.
References from mysqljs:
When upgrading from `mysql` to `mysql2`, to support authentication updates on our MySQL server(s), I found that my code around streaming, particularly when streams contain multiple resultsets, wasn't working properly because the resultset index wasn't being passed through as it is in `mysql`. This PR aims to resolve that.