elastic / elastic-transport-js

Transport classes and utilities shared among Node.js Elastic client libraries
Apache License 2.0

Suddenly, elastic transport cannot handle unicode characters. #130

Closed by superfan-dobri 2 months ago

superfan-dobri commented 2 months ago

💥 Regression Report

I'm not convinced this is an issue with elastic transport yet, but I'm hoping I can get some pointers to figure out what's going on. Did anything change in how elastic transport handles unicode characters? At around 9am Central time yesterday (8/23), multiple Elastic clusters across several environments suddenly started throwing the following error on every Elastic endpoint:

/usr/src/app/node_modules/.pnpm/@elastic+transport@8.7.0/node_modules/@elastic/transport/lib/Transport.js:604
                            : new errors_1.ConnectionError(error.message, result, errorOptions);
                              ^

ConnectionError: Request body length does not match content-length header
    at SniffingTransport._request (/usr/src/app/node_modules/.pnpm/@elastic+transport@8.7.0/node_modules/@elastic/transport/lib/Transport.js:604:31)
    at async SniffingTransport.request (/usr/src/app/node_modules/.pnpm/@elastic+transport@8.7.0/node_modules/@elastic/transport/lib/Transport.js:651:20)
    at async Client.IndexApi [as index] (/usr/src/app/node_modules/.pnpm/@elastic+elasticsearch@8.14.0/node_modules/@elastic/elasticsearch/lib/api/api/index.js:51:12)
    at async reindexModelItem (/usr/src/app/experimental/framework/elastic_util.js:174:5)
    at async updateModelInternal (/usr/src/app/experimental/framework/database.js:67:13) {
  options: { redaction: { type: 'replace', additionalKeys: [] } },
  meta: {
    body: undefined,
    statusCode: 0,
    headers: {},
    meta: {
      context: null,
      request: {
        params: {
          method: 'PUT',
          path: '/game/_doc/22585',
          body: '{ ... valid JSON ... }',
          querystring: '',
          headers: {
            'user-agent': 'elasticsearch-js/8.14.0 (linux 6.1.85+-x64; Node.js 22.7.0; Transport 8.7.0)',
            'x-elastic-client-meta': 'es=8.14.0,js=22.7.0,t=8.7.0,hc=22.7.0',
            authorization: '[redacted]',
            'content-type': 'application/vnd.elasticsearch+json; compatible-with=8',
            accept: 'application/vnd.elasticsearch+json; compatible-with=8',
            'content-length': '1256'
          }
        },
        options: {},
        id: 2770
      },
      name: 'elasticsearch-js',
      connection: {
        url: 'https://index-es-http:9200/',
        tls: {
          ca: { ... },
          rejectUnauthorized: true
        },
        id: 'https://index-es-http:9200/',
        timeout: 30000,
        headers: {
          'user-agent': 'elasticsearch-js/8.14.0 (linux 6.1.85+-x64; Node.js 22.7.0; Transport 8.7.0)',
          'x-elastic-client-meta': 'es=8.14.0,js=22.7.0,t=8.7.0,hc=22.7.0',
          authorization: '[redacted]'
        },
        deadCount: 0,
        resurrectTimeout: 0,
        _openRequests: 0,
        weight: 1000,
        maxEventListeners: 100,
        pool: { _events: {}, _eventsCount: 0, _maxListeners: undefined }
      },
      attempts: 3,
      aborted: false
    },
    warnings: null
  }
}

Node.js v22.7.0
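For context on what that error means: the content-length header is computed from the serialized body's byte length, while the HTTP layer separately encodes the same string to bytes when writing the request. Below is a minimal sketch of that accounting (the script is illustrative, not taken from the issue); on a correctly behaving Node.js build the two numbers always agree, and any disagreement surfaces as exactly this ConnectionError:

const body = JSON.stringify({ name: "José" });

// What a client would declare in the content-length header:
const declared = Buffer.byteLength(body, "utf8");

// What actually goes on the wire once the string is encoded:
const actual = Buffer.from(body, "utf8").length;

console.log({ declared, actual, match: declared === actual });
// Expect { declared: 16, actual: 16, match: true }: "é" encodes to
// two UTF-8 bytes, so the byte length exceeds the character length.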

Last working version

Worked up to version: 8.7.0

Stopped working in version: 8.7.0 (note: the transport version itself did not change when the failures began)

To Reproduce

Steps to reproduce the behavior: No idea

In the current configuration, this is guaranteed to fail:

await client.index({
  index: "test",
  id: "1",
  document: { name: "José" },
});

Note that other endpoints, such as search, exhibit the same behavior: if a unicode character is present anywhere in the request, it fails with the above error.
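One quick way to isolate the trigger is to index the same document with and without a multi-byte character (a hypothetical test using the client from the snippet above; the index name and ids are placeholders):

// ASCII-only payload: byte length equals character length, so this succeeds.
await client.index({
  index: "test",
  id: "ascii",
  document: { name: "Jose" },
});

// Multi-byte payload: "é" takes two UTF-8 bytes, and this call throws the
// ConnectionError shown above, which points at byte-length accounting
// rather than anything in the document itself.
await client.index({
  index: "test",
  id: "utf8",
  document: { name: "José" },
});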

Such index calls worked just fine prior to 9am yesterday. This is a self-hosted Elastic cluster running on Kubernetes in GKE. As far as I can tell, there have been no changes to the cluster itself, although the individual Elastic nodes have restarted.

Expected behavior

The index is updated and no errors are thrown.

Your Environment

- node version: 22.7.0
- @elastic/elasticsearch version: 8.14.0
- @elastic/transport version: 8.7.0
- os: Linux (self-hosted on GKE)

superfan-dobri commented 2 months ago

Well, this was fun, and short-lived. But yeah, we most likely got head-shotted by a node@latest dependency bump. There are several buffer/byteLength calculation issues in 22.7.0: https://github.com/nodejs/node/issues/54543#issuecomment-2308461496
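For anyone hitting this before they can upgrade, a simple stopgap is to refuse to start on the known-broken runtime (a hedged sketch: it only flags 22.7.x, since that is the release the linked Node issue identifies, and the error message text is mine):

// Fail fast at startup if we're on the Node.js release with the
// UTF-8 byte-length regression (see nodejs/node#54543).
const [major, minor] = process.versions.node.split(".").map(Number);
if (major === 22 && minor === 7) {
  throw new Error(
    `Node.js ${process.versions.node} miscalculates UTF-8 byte lengths; ` +
    "pin to 22.6.x or move to a release with the fix."
  );
}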

JoshMock commented 2 months ago

@superfan-dobri Since this appears to be a Node.js issue, and the only recent release of the transport doesn't look like it could have caused it, I'm going to close.