mysqljs / mysql

A pure node.js JavaScript Client implementing the MySQL protocol.
MIT License

Clarification around parallelization #2527

Closed: maslade closed this issue 2 years ago

maslade commented 2 years ago

Per the documentation:

The MySQL protocol is sequential, this means that you need multiple connections to execute queries in parallel. You can use a Pool to manage connections, one simple approach is to create one connection per incoming http request.

If I understand this right, I can safely parallelize my requests to the database, but the client will serialize them behind the scenes, so I won't get any performance benefit from doing so.

To demonstrate what I'm asking for clarification on:

function somethingTransactional() {
  return new Promise((resolve, reject) => {
    db.beginTransaction(err => {
      db.query('...', err => {
        db.query('...', err => {
          db.commit(err => { /* resolve or reject */ });
        });
      });
    });
  });
}

await Promise.all([
  somethingTransactional(),
  somethingTransactional(),
  // ...
]);

I'm trying to confirm that with this approach, while Node will "parallelize" the work, each transaction would execute serially, with no possibility that the MySQL client would intersperse queries from one invocation of somethingTransactional() with those from another. Effectively the same as if, instead of Promise.all, I'd used a sequence of await somethingTransactional(); calls.

dougwilson commented 2 years ago

Yes, your understanding is correct. It is not the library doing it on purpose, but just how the MySQL protocol itself works. If you want them to run in parallel, you need a new connection for each chunk of queries you want to run in parallel.
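To illustrate the point above, here is a minimal model of the behavior in plain Node — this is a hypothetical sketch, not the mysql driver itself: each `ModelConnection` stands in for one socket where the sequential protocol allows only one command in flight at a time. Giving each transaction its own connection lets the two transactions genuinely overlap, while each connection still executes its own commands strictly in order. (With the real library, the equivalent pattern is to check a connection out of a pool with `pool.getConnection()`, run the transaction on it, and `release()` it afterwards.)

```javascript
// Hypothetical model of the sequential MySQL wire protocol (NOT the real
// driver): a connection can have only one command in flight, so commands
// queue in FIFO order behind whatever is already pending on that socket.
const log = []; // shared event log, for inspection

class ModelConnection {
  constructor(name) {
    this.name = name;
    this.tail = Promise.resolve(); // tail of this socket's command queue
  }
  send(command) {
    // Queue this command behind everything already pending on this socket.
    const done = this.tail.then(async () => {
      log.push(`${this.name} -> ${command}`);
      await new Promise((r) => setTimeout(r, 5)); // simulated round trip
      log.push(`${this.name} <- ${command} ok`);
    });
    this.tail = done;
    return done;
  }
}

// Mirrors the nested-callback structure in the question: each command is
// issued only after the previous one on the same connection completes.
async function somethingTransactional(conn, tag) {
  await conn.send(`BEGIN ${tag}`);
  await conn.send(`UPDATE ... ${tag}`);
  await conn.send(`COMMIT ${tag}`);
}

// One connection per transaction: the two transactions overlap in time,
// because each socket has its own independent command queue.
const ready = (async () => {
  const a = new ModelConnection('connA');
  const b = new ModelConnection('connB');
  await Promise.all([
    somethingTransactional(a, 'T1'),
    somethingTransactional(b, 'T2'),
  ]);
  // T2's first command started before T1's first command finished:
  const overlapped =
    log.indexOf('connB -> BEGIN T2') < log.indexOf('connA <- BEGIN T1 ok');
  console.log('transactions overlapped across connections:', overlapped);
  // ...yet each connection's own commands ran strictly in issue order.
  console.log(log.filter((e) => e.startsWith('connA ->')).join('\n'));
})();
```

Had both transactions shared a single `ModelConnection`, every command would land on the same queue and nothing would overlap — which is the "no functional benefit" outcome the question describes.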

maslade commented 2 years ago

Great - and thank you for the quick response.