Changes the endpoint that we send events to, switching from the per-event events endpoint to the batch endpoint.
The previous flow of events was like this:
- event
- queue
- threads × `max_concurrent_batches`
  - send event to the events endpoint and push the response onto the responses queue
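The previous per-event flow can be sketched roughly like this (an illustrative sketch, not the actual libhoney implementation; `start_workers` and `send_one` are hypothetical names):

```ruby
require "thread"

# Illustrative sketch of the previous flow: a pool of worker threads,
# each popping one event at a time off the queue, "sending" it to the
# events endpoint, and pushing the response onto a responses queue.
# send_one stands in for the actual HTTP call.
def start_workers(event_queue, responses, num_threads, send_one)
  Array.new(num_threads) do
    Thread.new do
      # A nil on the queue signals this worker to shut down.
      while (event = event_queue.pop)
        responses << send_one.call(event)
      end
    end
  end
end
```

The key point is that every event costs one HTTP round trip, which is what the batching change below avoids.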
We now have the following flow:

- event
- queue
- thread
  - read event off the queue
  - add it to a hash keyed on dataset, API key, and API host
  - every `send_frequency`:
    - take the hash and split each set of events into groups of `max_batch_size`
    - push each batch onto the queue
- threads × `max_concurrent_batches`
  - serialise each event to JSON, sending errors to the responses queue
  - send each serialised batch to the batch endpoint, sending errors to the responses queue
  - read the response and push a response onto the responses queue for each event status returned in the response
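The aggregation step in the flow above can be sketched roughly like this (an illustrative sketch, not the actual libhoney code; the `BatchAggregator` class and its structure are hypothetical, though `send_frequency` and `max_batch_size` mirror the options named above):

```ruby
# Illustrative sketch of the batching step: events are grouped by
# (dataset, api key, api host), and every send_frequency seconds the
# sending thread calls flush, which splits each group into batches of
# at most max_batch_size and pushes them onto the batch queue.
class BatchAggregator
  def initialize(batch_queue, send_frequency: 0.1, max_batch_size: 50)
    @batch_queue = batch_queue
    @send_frequency = send_frequency # how often flush is called
    @max_batch_size = max_batch_size
    @batches = Hash.new { |h, k| h[k] = [] } # keyed on [dataset, apikey, apihost]
  end

  # Called once per event read off the event queue.
  def add(event)
    key = [event[:dataset], event[:apikey], event[:apihost]]
    @batches[key] << event
  end

  # Split each per-key set of events into groups of max_batch_size and
  # push each group onto the batch queue for the sender threads.
  def flush
    @batches.each do |key, events|
      events.each_slice(@max_batch_size) do |batch|
        @batch_queue << [key, batch]
      end
    end
    @batches.clear
  end
end
```

Grouping on the (dataset, API key, API host) tuple matters because events in one batch request must all target the same dataset with the same credentials.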
## Benchmarking

The following code was used to send 1000 events with both the current version of libhoney and this new batching approach:
```ruby
require "benchmark"
require "libhoney"

libhoney = Libhoney::Client.new(writekey: "...", dataset: "libhoney-benchmark")

b = Benchmark.measure do
  1000.times do
    event = libhoney.event
    event.add_field("look", "here")
    event.send
  end
  libhoney.close(true)
end

puts b
```
Current version:
New version: