signalfx / signalfx-nodejs

Node.js client library for SignalFx (Deprecated)

SignalFxClient queue will grow in size if the average send() datapoints length exceeds batchSize, leaking memory #105

Closed TysonAndre closed 1 year ago

TysonAndre commented 1 year ago

For example, if an application calls SignalFxClient.send() periodically with 400 datapoints and the default batchSize of 300, it will leak memory, because at most one HTTP POST is made per call to send() (see the reproduction sketch after the code links below):

  1. 400 items are added to signalFxClient.queue
  2. send() leads to a single call to startAsyncSend(), which removes 300 (batchSize) items from signalFxClient.queue, leaving 100 items behind
  3. Each time this repeats, the queue grows by another 100 items, leaking memory

https://github.com/signalfx/signalfx-nodejs/blob/48ce0587390ecfe3145c148339b070bb56d92476/lib/client/ingest/signal_fx_client.js#L231-L243

https://github.com/signalfx/signalfx-nodejs/blob/99d8fb3cd8ca2396ecf42a6a209ae2655bd8611e/lib/client/conf.js#L8
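A minimal reproduction sketch of the scenario above, assuming the standard Ingest client from the README; the token, metric name, and interval are illustrative, and client.queue is the internal queue referenced above:

```js
// Hypothetical reproduction (illustrative values): enqueue 400 datapoints per
// second while the default batchSize is 300, then watch the internal queue grow.
const signalfx = require('signalfx');

const client = new signalfx.Ingest('MY_ORG_TOKEN'); // batchSize defaults to 300

setInterval(() => {
  const gauges = [];
  for (let i = 0; i < 400; i++) {
    gauges.push({metric: 'example.gauge', value: Math.random()});
  }
  client.send({gauges: gauges}); // drains at most one batch of 300 per call
  // client.queue is the internal queue referenced above; it grows by ~100 per tick.
  console.log('queued datapoints:', client.queue.length);
}, 1000);
```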

Possible Solutions

  1. After successfully sending datapoints (when the promise resolves), call startAsyncSend() again if this.queue.length >= this.batchSize (see the sketch after this list): https://github.com/signalfx/signalfx-nodejs/blob/48ce0587390ecfe3145c148339b070bb56d92476/lib/client/ingest/signal_fx_client.js#L292
  2. Introduce a maximum queue size and log a warning when it is exceeded
  3. Chunk the queued datapoints into batchSize-sized requests and send them to SignalFx concurrently
  4. Allow a single HTTP request to carry more than batchSize datapoints (this may cause problems if SignalFx ingest endpoints reject requests whose body is too large in bytes; I haven't tried this)
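
A rough sketch of option 1, written as a self-contained stand-in rather than the library's actual internals (BatchingQueue and postDatapoints are hypothetical names, not part of signalfx-nodejs): after each successful flush, keep draining while at least batchSize datapoints remain queued.

```js
// Hypothetical sketch of option 1; postDatapoints stands in for the client's
// real HTTP request helper and is not part of the library.
class BatchingQueue {
  constructor(batchSize, postDatapoints) {
    this.batchSize = batchSize;
    this.postDatapoints = postDatapoints;
    this.queue = [];
  }

  send(datapoints) {
    Array.prototype.push.apply(this.queue, datapoints);
    return this.startAsyncSend();
  }

  startAsyncSend() {
    const batch = this.queue.splice(0, this.batchSize);
    if (batch.length === 0) {
      return Promise.resolve();
    }
    return this.postDatapoints(batch).then(() => {
      // After this batch is sent, flush again if a full batch is still queued,
      // so repeated oversized send() calls cannot grow the queue without bound.
      if (this.queue.length >= this.batchSize) {
        return this.startAsyncSend();
      }
    });
  }
}
```

With this pattern, the leftover from a 400-datapoint send() stays below batchSize once the flush chain completes, instead of growing by 100 per call.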

This affects the 7.x and 8.0.0beta releases (I didn't check earlier release lines).