Blizzard / node-rdkafka

Node.js bindings for librdkafka
MIT License

Segmentation fault and build errors using node16 and OpenSSL #1001

Open tb-tatewilks opened 1 year ago

tb-tatewilks commented 1 year ago

OS: Amazon Linux 2 (ID like: centos, rhel, fedora), x86_64

Installed packages (yum): openssl.x86_64, openssl-devel.x86_64, openssl-libs.x86_64, openssl11.x86_64, openssl11-libs.x86_64, openssl11-pkcs11.x86_64, gcc.x86_64, gcc-c++.x86_64, gcc-gfortran.x86_64, lz4.x86_64, cyrus-sasl-lib.x86_64, cyrus-sasl-plain.x86_64, make.x86_64, zlib.x86_64, zlib-devel.x86_64

Dependencies: OpenSSL 1.0.2k-fips, AWS CLI 2.9.17, Node v16.19.0, Python 3.7.16

Node process versions

{
  node: '16.19.0',
  v8: '9.4.146.26-node.24',
  uv: '1.43.0',
  zlib: '1.2.11',
  brotli: '1.0.9',
  ares: '1.18.1',
  modules: '93',
  nghttp2: '1.47.0',
  napi: '8',
  llhttp: '6.0.10',
  openssl: '1.1.1s+quic',
  cldr: '41.0',
  icu: '71.1',
  tz: '2022f',
  unicode: '14.0',
  ngtcp2: '0.8.1',
  nghttp3: '0.7.0'
}

My producer code:

import * as rdkafka from 'node-rdkafka';
import * as fs from 'fs';
import * as path from 'path';

const brokerString: string = 'broker-1:9094,broker-2:9094,broker-3:9094';
const topic: string = 'topic1';
const event = {
    ...fields // event payload fields elided in the original report
};

const producer: rdkafka.Producer = new rdkafka.Producer({
    'metadata.broker.list': brokerString,
    'debug': 'all',
    'client.id': 'testing',
    'dr_cb': true,
    'security.protocol': 'ssl',
    'ssl.ca.pem': fs.readFileSync(path.resolve(__dirname, '../certs/client.signed.crt'), 'utf-8'),
    'ssl.certificate.pem': fs.readFileSync(path.resolve(__dirname, '../certs/client.crt'), 'utf-8'),
    'ssl.key.pem': fs.readFileSync(path.resolve(__dirname, '../certs/client.key'), 'utf-8'),
    'ssl.key.password': 'password'
});
console.log(`Created producer >>>> ${producer}`);

const publishEvent = () => {
    console.log(`Producer started at ${new Date().toISOString()}`);
    const eventString: string = JSON.stringify(event);
    try {
        producer.produce(topic, undefined, Buffer.from(eventString), undefined, Date.now()); // (topic, partition, message, key, timestamp)
    } catch (e) {
        console.error(e);
    }
}

producer.connect();
producer.on('ready', publishEvent);
producer.on('event.error', (e) => { console.error(e); });
producer.on('event.log', (l) => { console.log(l); });

I've been trying to use this package on AWS, but building the addon and then connecting results in a segmentation fault because two conflicting OpenSSL versions end up in the same process (Node is statically linked against v1.1.1, while node-rdkafka is built against the system's v1.0.2). I upgraded the system OpenSSL to 1.1.1 hoping to avoid the segfault, but then the build fails with a long list of node-gyp errors.
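The two OpenSSL versions in play can be confirmed from a shell before any native code loads (a sketch; the versions shown in the comments are what this host reports and will differ elsewhere):

```shell
# OpenSSL statically linked into the Node binary itself (1.1.1s+quic on this host):
node -p "process.versions.openssl"
# OpenSSL the system toolchain links native addons against by default (1.0.2k-fips here):
openssl version
```

If the major/minor lines differ (1.1.x vs 1.0.x), the addon and Node are loading two incompatible libcrypto ABIs into one process, which matches the segfault described above.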

I've tested several Node packages for interacting with Kafka and this one far outpaces the others, but the build issues and segmentation faults are keeping me from adopting it. I think upgrading the package to build cleanly against OpenSSL 1.1.1 would be a worthwhile improvement, since more modern systems ship later versions.

If this has already been resolved, how do I bypass these issues and build correctly? My org requires TLS when sending data to Kafka, so I can't just ignore the OpenSSL issues. I also don't want to downgrade Node to a version whose bundled OpenSSL is compatible with node-rdkafka, because everything else in our architecture runs on Node 16+.
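One workaround worth trying on Amazon Linux 2 is rebuilding the addon with the 1.1.x headers and libraries in view. This is a hedged sketch, not an official fix: the `openssl11-devel` package name is real on AL2, but the include/lib paths below are assumptions you should verify with `rpm -ql openssl11-devel` before relying on them:

```shell
# Install the OpenSSL 1.1.x headers and libraries alongside the 1.0.2 system default.
sudo yum install -y openssl11 openssl11-devel

# Point the compile/link steps at the 1.1 tree (paths are assumptions -- verify first):
export CPPFLAGS="-I/usr/include/openssl11"
export LDFLAGS="-L/usr/lib64/openssl11"

# Rebuild the native addon from scratch so stale 1.0.2-linked objects are discarded:
rm -rf node_modules/node-rdkafka
npm install node-rdkafka
```

After the rebuild, the addon and Node should both be on the 1.1.x ABI line, which removes the mixed-libcrypto condition behind the segfault.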

robertlight commented 9 months ago

Did you ever find a solution to your "ssl error"? I am trying to figure out which version of Node and which version of node-rdkafka I should be using to avoid the SSL problem.