I want to run a RabbitMQ producer alongside my Laravel API. The problem is that I receive a lot of requests per second.
I would like to know whether there is any way to keep the producer alive.
I tested locally and found that each request creates a new connection instance.
Here are the results of publishing to the queue and then returning JSON:
Response:
public function index()
{
    Amqp::publish('routing-key', 'message', ['queue' => 'test']);

    return response()->json([
        'message' => 'Hi! Are you lost?'
    ]);
}
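One common way to take the AMQP round-trip off the response path (without solving the connection reuse itself) is to defer the publish until after the response has been sent. A minimal sketch, assuming Laravel 7+ (`dispatchAfterResponse`) and the same `Amqp` facade; the job class name is my own invention:

```php
<?php

namespace App\Jobs;

use Amqp;
use Illuminate\Foundation\Bus\Dispatchable;

// Hypothetical job: runs after the HTTP response is flushed, so the
// client never waits on the AMQP connection setup or the publish.
class PublishToQueue
{
    use Dispatchable;

    private $message;

    public function __construct(string $message)
    {
        $this->message = $message;
    }

    public function handle(): void
    {
        Amqp::publish('routing-key', $this->message, ['queue' => 'test']);
    }
}

// In the controller:
// PublishToQueue::dispatchAfterResponse('message');
// return response()->json(['message' => 'Hi! Are you lost?']);
```

This does not keep the connection alive between requests, but it should move the latency you measured out of the client-visible response time.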
ab -n1000 -c10 api.rabbit.test/
This is ApacheBench, Version 2.3 <$Revision: 1843412 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking api.rabbit.test (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Completed 600 requests
Completed 700 requests
Completed 800 requests
Completed 900 requests
Completed 1000 requests
Finished 1000 requests
Server Software: nginx/1.17.9
Server Hostname: api.rabbit.test
Server Port: 80
Document Path: /
Document Length: 31 bytes
Concurrency Level: 10
Time taken for tests: 22.400 seconds
Complete requests: 1000
Failed requests: 557
(Connect: 0, Receive: 0, Length: 557, Exceptions: 0)
Non-2xx responses: 557
Total transferred: 2471795 bytes
HTML transferred: 2246189 bytes
Requests per second: 44.64 [#/sec] (mean)
Time per request: 224.000 [ms] (mean)
Time per request: 22.400 [ms] (mean, across all concurrent requests)
Transfer rate: 107.76 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.0 0 1
Processing: 75 222 74.8 210 724
Waiting: 75 222 74.8 210 724
Total: 75 222 74.8 210 724
Percentage of the requests served within a certain time (ms)
50% 210
66% 237
75% 252
80% 263
90% 306
95% 355
98% 439
99% 500
100% 724 (longest request)
Here are the results when just returning JSON:
Response:
public function index()
{
    return response()->json([
        'message' => 'Hi! Are you lost?'
    ]);
}
ab -n1000 -c10 api.rabbit.test/
This is ApacheBench, Version 2.3 <$Revision: 1843412 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/
Benchmarking api.rabbit.test (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Completed 600 requests
Completed 700 requests
Completed 800 requests
Completed 900 requests
Completed 1000 requests
Finished 1000 requests
Server Software: nginx/1.17.9
Server Hostname: api.rabbit.test
Server Port: 80
Document Path: /
Document Length: 31 bytes
Concurrency Level: 10
Time taken for tests: 14.362 seconds
Complete requests: 1000
Failed requests: 0
Total transferred: 280000 bytes
HTML transferred: 31000 bytes
Requests per second: 69.63 [#/sec] (mean)
Time per request: 143.620 [ms] (mean)
Time per request: 14.362 [ms] (mean, across all concurrent requests)
Transfer rate: 19.04 [Kbytes/sec] received
Connection Times (ms)
min mean[+/-sd] median max
Connect: 0 0 0.0 0 1
Processing: 68 122 104.8 102 2337
Waiting: 68 122 104.8 102 2337
Total: 68 122 104.8 102 2338
Percentage of the requests served within a certain time (ms)
50% 102
66% 109
75% 117
80% 125
90% 156
95% 193
98% 380
99% 577
100% 2338 (longest request)
I receive a lot of requests per second, so the extra milliseconds spent on each publish can become a bottleneck, and it would be extremely important to keep this connection open.
Is there any way to keep the connection alive on the producer side?
This library does not support keeping the connection alive between requests, and neither does PHP itself right now.
The producer only stays alive until the AMQP facade is destructed at the end of the request.
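Within a single request you can at least avoid opening more than one connection by binding it as a container singleton; and under a long-running runtime (e.g. Laravel Octane or a Swoole/RoadRunner worker, rather than classic PHP-FPM), the same singleton survives across requests. A sketch using php-amqplib directly, assuming it is installed; the `amqp.*` config keys are placeholders:

```php
<?php

use PhpAmqpLib\Connection\AMQPStreamConnection;

// In a service provider's register() method: the closure runs once per
// PHP process, so under PHP-FPM the connection lives for one request,
// while under a long-running worker it is reused across requests.
$this->app->singleton(AMQPStreamConnection::class, function () {
    return new AMQPStreamConnection(
        config('amqp.host', 'localhost'),
        config('amqp.port', 5672),
        config('amqp.user', 'guest'),
        config('amqp.password', 'guest')
    );
});
```

The trade-off is that a long-lived connection can silently drop; you would need to handle reconnection (e.g. catch the publish exception and rebuild the connection) yourself.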