sidorares / node-mysql2

:zap: fast mysqljs/mysql compatible mysql driver for node.js
https://sidorares.github.io/node-mysql2/
MIT License
4.07k stars 618 forks

Error: Can't add new command when connection is in closed state #939

Closed Rakeshvishnoi029 closed 2 years ago

Rakeshvishnoi029 commented 5 years ago

Hi

I have been facing this error for the last 2 days, please help me...

{ Error: read ETIMEDOUT
    at TCP.onread (net.js:622:25)
  errno: 'ETIMEDOUT',
  code: 'ETIMEDOUT',
  syscall: 'read',
  fatal: true }
{ Error: Can't add new command when connection is in closed state
    at PoolConnection._addCommandClosedState

I use mysql2 and connect via a pool:

var mysql = require('mysql2');

var mysqlPool = mysql.createPool({
  host: 'localhost',
  user: 'root',
  password: 'xyz',
  database: 'xyz',
  waitForConnections: true,
  connectionLimit: 10,
  queueLimit: 0
});

module.exports = mysqlPool;


digiperfect-janak commented 5 years ago

+1 we have been getting this issue since yesterday.

Error: This socket has been ended by the other party

followed by

Error: Can't add new command when connection is in closed state

yunfan commented 5 years ago

we too :[

zebullax commented 5 years ago

Same issue here, the only hack is to stop and restart the node app...

Akash0333 commented 5 years ago

Same issue here, please tell me how we can resolve it

ovidiu-balau commented 5 years ago

Anyone found a fix for this?

nag944 commented 5 years ago

Any news on this problem? After the connection times out, it cannot be restored at all, since all network-related setup is performed in the constructor. Also, this error is returned as a text message only; no error codes are provided.

Is it safe to just recreate the connection object? Actually that's a very poor idea, as the connection instance is passed to many structures in the program...

ovidiu-balau commented 5 years ago

Still an issue, anyone found any solution for this?

jovemnf commented 5 years ago

me too

ganchuhang commented 5 years ago

I got this as well

sidorares commented 5 years ago

Hey @coder77 @jovemnf @ovidiu-balau @Akash0333 @zebullax and everyone else experiencing this issue - is there a reliable way to reproduce? Ideally a self-contained dockerized setup, or at least a sequence of instructions.

nag944 commented 5 years ago

Hey @coder77 @jovemnf @ovidiu-balau @Akash0333 @zebullax and everyone else experiencing this issue - is there a reliable way to reproduce? Ideally a self-contained dockerized setup, or at least a sequence of instructions.

It's extremely easy to reproduce - just start a connection and wait for the default timeout (you can emulate it by changing the default MySQL timeout setting to a smaller value - by default it's around 8 hours).

Btw, I tested it not with a pool but with a raw connection - in that case the error object is completely empty; only the text message is present.

Having checked the source code, it seems this problem cannot be solved in the current state. When the MySQL connection has gone away, the network connection instance data inside the component is destroyed, so you cannot just restart the networking - you must create a whole new connection instance (the network connection instance is created in the constructor!).

sidorares commented 5 years ago

Having checked the source code, it seems this problem cannot be solved in the current state.

Yes, an individual Connection is not meant to survive an underlying transport reconnect; it's a 1:1 mapping to a MySQL connection/thread with the same possible states (authentication, change user, etc.). When the network connection dies or is actively closed by the server it should not be re-used - this is just how it is designed. If that is your issue, the solution should be in your code (listen for error / end events, etc.). If you want a simpler API, just use a pool - it should handle all of this.
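
For anyone following the advice above, here is a minimal sketch of the listen-and-replace pattern. The 'error' event and the err.fatal flag are part of the mysql2 connection API; the injected factory and the onReplace callback are purely illustrative, not part of the library:

// Wraps a connection so that fatal errors (ETIMEDOUT, ECONNRESET,
// PROTOCOL_CONNECTION_LOST) trigger creation of a replacement instead of
// reusing the dead connection. `createConnection` is injected so the
// pattern can be shown without a live server.
function managedConnection(createConnection, config, onReplace) {
  const conn = createConnection(config);
  conn.on('error', (err) => {
    if (err.fatal) {
      // A fatal error means this Connection object is finished for good;
      // hand the caller a fresh one rather than issuing more commands on it.
      onReplace(managedConnection(createConnection, config, onReplace));
    }
  });
  return conn;
}

With the real driver you would pass require('mysql2').createConnection as the factory; non-fatal errors are delivered to the active query's callback and do not require replacing the connection.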

nag944 commented 5 years ago

But as I see from the previous messages, the pool seems to have the same problems?

The only way I found is to send stub queries every hour to keep the connection alive. Otherwise I cannot even perform a correct destruction - it hangs forever, so I must forget about the instance and create a new one, relying on the garbage collector to do its job right.

zebullax commented 5 years ago

Unless I'm mistaken, the issue is not limited to single connections but also affects connections managed by a pool (that's the setup I was using). That would make sense, since the pool only manages creating and yielding connections: if an existing connection has been remotely closed and is yielded by the pool, we get the same error as if we were working with a simple connection.

sidorares commented 5 years ago

the pool seems to have the same problems?

I'm happy to try to investigate; if I had a simple way to reproduce this with a pool, that would help me a lot.

if an existing connections has been remotely closed, and is yielded by the pool, we get the same error that if we were working with a simple connection

In the case of a pool it's different in terms of who is responsible for tracking and recycling dead connections. The pool should do it all for you, while when you use an individual connection you must listen for errors (both on the connection and on query results).

MR4online commented 5 years ago

We are having the same problem with pooled connections. If our MySQL stats are correct, the count of active connections does not vary over time. The last time the error occurred was when a cron job caused many queries. Maybe the creation of new connections doesn't work as expected. Is there a way to debug that?

Our code looks pretty much like the docs at https://github.com/mysqljs/mysql#pooling-connections. Most of the logic is simple (get a new connection from a permanent pool, query it, release it).

ADumaine commented 5 years ago

I'm using a single connection, and after the connection is lost con.end() or con.release() throws the error. My current solution is to set the connection object to null and create a new connection on the error:

if (con) con = null;
initcon(); // sets up the connection

T1MOXA commented 5 years ago

I have the same problem

Xsmael commented 5 years ago

Me too! Any fix?

juaneuro90 commented 5 years ago

Guys I'm having the same issue.

jmordica commented 5 years ago

Me too.

jmordica commented 5 years ago

Switching to createPool resolved my issue.

franck-nadeau commented 4 years ago

Switching to createPool resolved my issue.

@jmordica What do you mean by switching to createPool? That is what I am using, and I am getting this problem. Also, that is what Rakeshvishnoi029's initial code used. Are you passing something special as an option?

zeeshanaligold commented 4 years ago

Hi @jmordica

I have the same issue. Can you please tell me how you fixed it? Thanks.

const mysql = require('mysql2');

const con = mysql.createConnection({
    host     : process.env.RDS_HOSTNAME,
    user     : process.env.RDS_USERNAME,
    password : process.env.RDS_PASSWORD,
    database : process.env.RDS_DATABASE
});

ghost commented 4 years ago

I have same issue!

rudiantotiofan commented 4 years ago

Any update on this issue? I got the same problem.

Thanks,

davehorton commented 4 years ago

Same issue here, using createPool. Any updates on this?

const pool = mysql.createPool({
  host: process.env.XX,
  user: process.env.XXX,
  password: process.env.XX,
  database: process.env.XX,
  connectionLimit: process.env.XX
});
"err":
{
  "type": "Error",
  "message": "Can't add new command when connection is in closed state",
  "stack": "Error: Can't add new command when connection is in closed state
      at PoolConnection._addCommandClosedState (/../mysql2/lib/connection.js:137:17)
      at PoolConnection.query (../mysql2/lib/connection.js:502:17)
      at getMysqlConnection (../login.js:31:10)
      at process.nextTick (../mysql2/lib/pool.js:44:37)
      at process._tickCallback (internal/process/next_tick.js:61:11)",
  "fatal": true
},
"msg": "Error getting db connection",
"v": 1

At the very least, please update the docs to remove the misleading information that the pool handles this for you.

sidorares commented 4 years ago

@davehorton unfortunately not enough information to debug. Can you make a test repo to reproduce?

This can happen if you call .end() on a connection and later try to call .execute(). This might be a logic error in your application, but it could also be a bug in the driver; unfortunately there is not enough information to tell.

runekm commented 4 years ago

(EDIT: I see that I was running an old version of the package (1.7.0). I tried upgrading to the newest version. I don't know if there is any difference yet.)

I too get this error from time to time. I haven't investigated the problem in detail, but it seems to occur every couple of days and lasts until the app is restarted.

The connection pool is created with:

var mysql = require('mysql2');
pool = mysql.createPool({
    connectionLimit : config.DB_CONNECTION_LIMIT,
    host            : config.DB_HOST,
    port            : config.DB_PORT,
    user            : config.DB_USER,
    password        : config.DB_PASSWORD,
    database        : config.DB_DATABASE,
    dateStrings     : true,
    debug           : config.DB_DEBUG
});
promisePool = pool.promise();

The error occurs when I run [ results, fields ] = await connection.query( sql ); after running connection = await promisePool.getConnection();, which itself gives no error.

The error says: Error: Can't add new command when connection is in closed state at PromisePoolConnection.query ([path_to_my_app]/node_modules/mysql2/promise.js:92:22

d3rf commented 4 years ago

I solved it in a class by doing something like this:

class MySQLExec {
  // class properties and functions go here
  constructor() {}

  conecta() {
    this.conn = mysql.createConnection({
      host: '',
      user: '',
      database: '',
      password: config.password,
      multipleStatements: true
      // waitForConnections: true
    });
  }

  run(query, rows) {
    var returnResult = null;
    // connect
    this.conecta();
    this.conn.query(query, rows, function(err, results, fields) {
      // results contains the rows returned by the server
      returnResult = results;
      if (err) {
        console.error(err);
      }
      // console.log(fields); // fields contains extra metadata about results, if available
    });
    // disconnect after run
    this.conn.end();
    return returnResult;
  }
}

skorolev-tjc commented 4 years ago

I had the same issue with pooled connections and an 8.0.15 server, and to make things worse I also had long-running connections. For now I am cautiously optimistic that the issue is resolved. Some changes must be made in your logic:

  1. If you need long-running connections, first check the MySQL server parameter wait_timeout - it is in seconds, and the server will kill silent connections after that period of time; it may be sufficient for you to simply increase this value.
  2. If wait_timeout is not sufficient, you can call the connection.ping method at some interval - it will keep the connection alive.
  3. Do analyse the error object after operations: if it has fatal = true then the connection has become useless (probably it was closed somehow or the connection was lost) - you must call connection.destroy(), which removes this poisoned connection from the pool; otherwise the pool can hand out this connection again. Probably some change should be implemented in the pool logic.
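
Points 2 and 3 above can be sketched as follows. connection.ping(), connection.destroy(), and err.fatal are real parts of the mysql2 callback API; the interval value and the function name are just illustrative:

// Periodically pings a connection so it stays under wait_timeout, and
// destroys it on the first fatal error so a pool will not hand it out again.
function keepAlive(conn, intervalMs) {
  const timer = setInterval(() => {
    conn.ping((err) => {
      if (err && err.fatal) {
        clearInterval(timer); // stop pinging a dead connection
        conn.destroy();       // drop the poisoned connection for good
      }
    });
  }, intervalMs);
  return timer; // caller can clearInterval() on shutdown
}
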

gaurav16694 commented 4 years ago

Is there any update on this issue, either on how to reproduce it or how to fix it?

HsinHeng commented 4 years ago

In my case, I performed an insertMany of more than 30,000 records at one time, and the error was produced.

I resolved my issue by increasing max_allowed_packet in the MySQL server config, because the transmitted bytes were too large.

sidorares commented 4 years ago

@HsinHeng interesting, maybe that can be solved at the driver level. We should be able to automatically detect max_allowed_packet (from settings, or by querying the variable immediately after connection) and then split outgoing packets into smaller packets - see https://dev.mysql.com/doc/internals/en/sending-more-than-16mbyte.html

mysql2 does support large incoming packet sizes (see https://github.com/sidorares/node-mysql2/blob/442d3042245c761eaeafe36b0e35d45cb3f87025/lib/packet_parser.js#L83-L97), but your issue is probably related to packet length > max_allowed_packet rather than to 16 MB packets. I need to double-check whether max_allowed_packet can be sent automatically at connection time, but if not it can still be queried using .query()

sidorares commented 4 years ago

@HsinHeng what exactly was the error code in your case? Having some help in the error message would be useful, I guess. Something along the lines of: if (error.code === errors.ER_NET_PACKET_TOO_LARGE) { error.message = "Packet too large. Check max_allowed_packet system/session variable. See [wiki url with more info] for more information" }
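
The suggested enrichment, sketched as a standalone helper. ER_NET_PACKET_TOO_LARGE is a real MySQL error code (errno 1153); the helper name is hypothetical:

// Appends a pointer to the relevant server variable when the server
// rejects an oversized packet; other errors pass through untouched.
function describePacketError(err) {
  if (err.code === 'ER_NET_PACKET_TOO_LARGE') {
    err.message +=
      ' Packet too large. Check the max_allowed_packet system/session variable.';
  }
  return err;
}
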

HsinHeng commented 4 years ago

@HsinHeng what exactly was the error code in your case? Having some help in the error message would be useful, I guess. Something along the lines of: if (error.code === errors.ER_NET_PACKET_TOO_LARGE) { error.message = "Packet too large. Check max_allowed_packet system/session variable. See [wiki url with more info] for more information" }

@sidorares You got it!

Here is my error details.

"name":"SequelizeDatabaseError","parent":{"code":"ER_NET_PACKET_TOO_LARGE","errno":1153,"sqlState":"08S01","sqlMessage":"Got a packet bigger than 'max_allowed_packet' bytes" }

Hope you could resolve this issue.

sidorares commented 4 years ago

@HsinHeng well, in your case the error text from the server already had some hints.

What we can do is add a maxAllowedPacketSize connection option that defaults to "auto". When set to auto, a query is performed immediately on connection to get the max_allowed_packet variable (downside: slightly worse latency until the first "useful" response, especially when connections are always short-lived).
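
The "auto" behaviour described above would boil down to one extra round-trip after connect. A sketch against the promise-style query API (maxAllowedPacketSize is only a proposed option name; the SELECT itself is standard MySQL):

// Asks the server for its max_allowed_packet value, so outgoing packets
// could then be split to fit under that limit.
async function detectMaxAllowedPacket(conn) {
  const [rows] = await conn.query(
    'SELECT @@max_allowed_packet AS maxAllowedPacket'
  );
  return rows[0].maxAllowedPacket; // size in bytes
}
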

greedThread commented 3 years ago

Still an issue, anyone found any solution for this?

Veijar commented 3 years ago

the same problem

Desnoo commented 3 years ago

We also have this issue when the connections in the connection pool are idle for a long time. I debugged down to the getConnection point and printed out the connection state that is returned at https://github.com/sidorares/node-mysql2/blob/07a429d9765dcbb24af4264654e973847236e0de/lib/pool.js#L45

The contents of the connection are as follows:

  1: connection =
    { _events: Object,
      _eventsCount: 1,
      _maxListeners: 'undefined',
      config: ConnectionConfig,
      stream: Socket,
      ... }
  2: connection._fatalError =
    { errno: 'ECONNRESET',
      code: 'ECONNRESET',
      syscall: 'read',
      fatal: true,
      stack: 'Error: read ECONNRESET\n' +
      '    at TCP.onStreamRead [as…tic-apm-node/lib/instrumentation/index.js:328:27)',
      ... }
  3: connection._protocolError =
    { fatal: true,
      code: 'PROTOCOL_CONNECTION_LOST',
      stack: 'Error: Connection lost: The server closed the conn…js:483:12)\n' +
      '    at TCP.<anonymous> (net.js:675:12)',
      message: 'Connection lost: The server closed the connection.',
      Symbol(callsites): Array(4) }
  4: connection._statements =
    { Symbol(max): 16000,
      Symbol(lengthCalculator): ,
      Symbol(allowStale): false,
      Symbol(maxAge): 0,
      Symbol(dispose): ,
      ... }
  5: connection._internalId = 3

So the error seems to occur when the MySQL server closes the connection ("Connection lost: The server closed the connection."). Do you have any hints on how to prevent this?

JonHX commented 3 years ago

Same issue here

SouzaCarleone commented 3 years ago

Error: Can't add new command when connection is in closed state

Same issue here, any suggestion ?

yo-wan commented 3 years ago

A possible workaround could be to check whether any of the connection's properties _fatalError, _protocolError, or _closing is true before executing the query, and to get a new connection if so. But that is ugly, and I'm looking for a better solution, probably one based on events.
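
The workaround reads roughly like this. Note that these underscore-prefixed properties are mysql2 internals, not public API, and may change between versions:

// Returns true when none of the internal "dead connection" markers are set.
// Relying on private fields like this is fragile; treat it as a stopgap.
function isUsable(conn) {
  return !conn._fatalError && !conn._protocolError && !conn._closing;
}
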

irepela commented 3 years ago

I solved this problem by refactoring my code to use pool.query (which releases the connection internally) instead of using connection.query and releasing the connection manually. It looks like there was a connection leak in my code.

MaksemM commented 3 years ago

I solved this problem by refactoring my code to use pool.query (which releases the connection internally) instead of using connection.query and releasing the connection manually. It looks like there was a connection leak in my code.

Do you have an example? I did the same thing using pool.query and it worked fine for much longer, but then it happened again. Did yours get permanently fixed?

irepela commented 3 years ago

It didn't happen again in my case. Here is an example of what was changed:

Old code:

let connection;
try {
  connection = await pool.getConnection();
  const [rows] = await connection.query(SELECT_USERS, [id]);
  return res.send(rows);
} catch (sqlError) {
  await handleError(res, sqlError);
} finally {
  await connection.release();
}

New code:

try {
  const [rows] = await pool.query(SELECT_USERS, [id]);
  return res.send(rows);
} catch (sqlError) {
  await handleError(res, sqlError);
}

ChromeUniverse commented 3 years ago

It didn't happen again in my case. Here is an example of what was changed:

Old code:

let connection;
try {
  connection = await pool.getConnection();
  const [rows] = await connection.query(SELECT_USERS, [id]);
  return res.send(rows);
} catch (sqlError) {
  await handleError(res, sqlError);
} finally {
  await connection.release();
}

New code:

try {
  const [rows] = await pool.query(SELECT_USERS, [id]);
  return res.send(rows);
} catch (sqlError) {
  await handleError(res, sqlError);
}

Has anyone else been able to confirm that this works, or found other fixes? I'll try refactoring my code anyway to test this out, but I would love to get more information if possible.

I'm currently using mysql2's promise API to create the connection but also getting the same error:

// connect to MySQL database
async function sql_connect() {
  // connect to database
  db = await mysql.createConnection({
    host     : 'localhost',
    user     : 'lucca',
    password : process.env.MYSQL_PASSWORD, 
    database : 'tank_battle'
  });

  return db;
}
MaksemM commented 3 years ago

It didn't happen again in my case. Here is an example of what was changed:

Old code:

let connection;
try {
  connection = await pool.getConnection();
  const [rows] = await connection.query(SELECT_USERS, [id]);
  return res.send(rows);
} catch (sqlError) {
  await handleError(res, sqlError);
} finally {
  await connection.release();
}

New code:

try {
  const [rows] = await pool.query(SELECT_USERS, [id]);
  return res.send(rows);
} catch (sqlError) {
  await handleError(res, sqlError);
}

Thanks for that. I changed my code to use promise pool and it seems to have resolved itself, but I'm still cautious. Thanks for showing me the snippet.

MaksemM commented 3 years ago

It didn't happen again in my case. Here is an example of what was changed:

Old code:

let connection;
try {
  connection = await pool.getConnection();
  const [rows] = await connection.query(SELECT_USERS, [id]);
  return res.send(rows);
} catch (sqlError) {
  await handleError(res, sqlError);
} finally {
  await connection.release();
}

New code:

try {
  const [rows] = await pool.query(SELECT_USERS, [id]);
  return res.send(rows);
} catch (sqlError) {
  await handleError(res, sqlError);
}

Has anyone else been able to confirm that this works or found other fixes? I'll try refactoring my code anyways to test this out, but I would love to get more information if possible.

I'm currently using mysql2's promise API to create the connection but also getting the same error:

// connect to MySQL database
async function sql_connect() {
  // connect to database
  db = await mysql.createConnection({
    host     : 'localhost',
    user     : 'lucca',
    password : process.env.MYSQL_PASSWORD, 
    database : 'tank_battle'
  });

  return db;
}

I'll show you what is currently working for me:

Database file:

require('dotenv').config()
const mysql = require('mysql2')

module.exports =  mysql.createPool({
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_DATABASE,
  timezone: 'Australia/Sydney',
  waitForConnections: true,
  connectionLimit: 10,
  queueLimit: 0,
  debug: false
})

Controller file snippet

const pool = require('../models/db/db')
const promisePool = pool.promise()

createSalesOrderPost: async (req, res) => {
        const page_path = '/admin/sales-orders/edit-order?clientid=' + req.body.clientid
        try {

            if (req.file) {
                var xmlfile = req.file.originalname
            }

            // Database details
            const sql_1 = 'INSERT INTO mt_sales_orders (client_id, payment_id, sales_order_amount, xml_order) VALUES (' + req.body.clientid + ', ' + req.body.paymentid + ', ' + req.body.paymentamount + ', "' + xmlfile + '");'

            const [rows1, fields1] = await promisePool.query(sql_1)

            const sql_2 = 'UPDATE mt_payments SET sales_order_id= (SELECT sales_order_id FROM mt_sales_orders WHERE payment_id=' + req.body.paymentid + ') WHERE payment_id=' + req.body.paymentid

            // Create database entry
            const [rows2, fields2] = await promisePool.query(sql_2)

            await req.flash($success_msg, 'XML Order added successfully.')
            res.redirect(page_path)            

        } catch (error) {
            await req.flash($error_msg, 'Something went wrong.' + ' ' + error)
        }
    }

I haven't had any issues in my local environment, and the issue seems to be fixed on the live server.

The project is still in development, so when I update the live server with the finished program I'll post an update if there is an issue.

I also have a custom function set up to run multiple SQL statements in a loop, but I am going to remove it since promise pooling seems to work without issues now.

const pool = require('../models/db/db')

const dbQueryPost = (sql, page_path, res) => {
    pool.getConnection(function (err, conn) {
        const sql_length = sql.length - 1
        sql.forEach((q, i) => {
            conn.query(q, (err, rows, fields) => {
                if (i == sql_length) {
                    res.redirect(page_path)
                }
            })
        })
        // conn.release()
        pool.releaseConnection(conn)
    })
}

module.exports = {
    dbQueryPost
}

And use like this inside controller:

const f = require('./defaultFunctions')

action: (req, res) => {
     const sql_1= 'SELECT * FROM mt_posts'
     const sql_2= 'SELECT * FROM mt_orders'

     const sql= [sql_1,sql_2]

     const page_path = '/posts'
     f.dbQueryPost(sql, page_path, res)
}
ishmum123 commented 3 years ago

The socket it uses internally has a destroyed property. I was thinking of accessing that and reconnecting when a new query comes in. Please suggest if there's something better than this. Alternatively, you could expose the property.
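
The idea above, sketched out. A mysql2 connection exposes its socket as conn.stream, and destroyed is a standard Node.js net.Socket property; makeConnection is a hypothetical factory you would supply:

// Returns the existing connection if its socket is still alive,
// otherwise builds a replacement before the next query is issued.
function getLiveConnection(conn, makeConnection) {
  if (!conn || !conn.stream || conn.stream.destroyed) {
    return makeConnection(); // lazily reconnect on the next query
  }
  return conn;
}
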