ably / ably-python

Python client library SDK for Ably realtime messaging service
https://ably.com/download
Apache License 2.0
47 stars · 24 forks

Load testing against httpx #554

Open sacOO7 opened 5 months ago

sacOO7 commented 5 months ago

Do load testing on a dedicated server using

Issue is synchronized with this Jira Task by Unito

sync-by-unito[bot] commented 5 months ago

➤ Automation for Jira commented:

The link to the corresponding Jira issue is https://ably.atlassian.net/browse/SDK-4077

sacOO7 commented 5 months ago

We can either write our own load-testing script (https://medium.com/@yusufenes3494/how-to-build-your-own-load-test-a-step-by-step-guide-1a8367f7f6a2), or clone the python-locust project and modify https://github.com/locustio/locust/blob/master/locust/clients.py to use the httpx client instead of python-requests. The latter would let us use all of Locust's features, such as result tracking in the web UI, without writing explicit code for them.
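For the "write our own script" option, a minimal sketch could look like the following. It is pure stdlib; `make_request` is a placeholder callable (an assumption of this sketch) that would wrap an actual `httpx.Client.get(...)` call in the real test:

```python
import statistics
import threading
import time


def run_load_test(make_request, users=10, requests_per_user=5):
    """Run `make_request` concurrently from `users` threads and collect latencies.

    `make_request` is any zero-argument callable that performs one HTTP
    request, e.g. ``lambda: client.get("https://rest.ably.io/time")``.
    """
    latencies = []
    lock = threading.Lock()

    def worker():
        for _ in range(requests_per_user):
            start = time.perf_counter()
            make_request()
            elapsed_ms = (time.perf_counter() - start) * 1000
            with lock:
                latencies.append(elapsed_ms)

    threads = [threading.Thread(target=worker) for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    return {
        "requests": len(latencies),
        "avg_ms": statistics.mean(latencies),
        "max_ms": max(latencies),
    }
```

Locust would still be preferable for anything beyond a quick check, since it already handles ramp-up, distribution across workers, and live graphing.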

sacOO7 commented 5 months ago

We will need a dedicated Ably RDP server to run this script against an Ably limitless account :)

sacOO7 commented 5 months ago

I don't see any point in generalizing this load-tester client, since this is a library-specific issue. Most of the time, HTTP client libraries are stable and don't need to be tested themselves; load testing is specifically done to test servers under load, not the client.

sacOO7 commented 5 months ago

Also, it doesn't make sense to load test clients across different SDKs: every client is written in a different language, we would need a separate script for each, and every language has different performance metrics. For now, we can just focus on ably-python and check that it works as expected.

sacOO7 commented 4 months ago

A load-testing TPS graph is given at https://blog.devgenius.io/10-reasons-you-should-quit-your-http-client-98fd4c94bef3

sacOO7 commented 4 months ago

Test with both servers:

  1. Local HTTP server
  2. Ably: `https://rest.ably.io/time`

Test with `requests` and `niquests` -> singleton client, run in both sync and async modes.
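The singleton idea above can be sketched as a cached factory. In the real script the factory would be `httpx.Client`, `httpx.AsyncClient`, `requests.Session`, or `niquests.Session`; here it is any zero-argument callable (an assumption of this sketch), so every user thread reuses the same connection pool:

```python
from functools import lru_cache


@lru_cache(maxsize=None)
def get_client(factory):
    """Return one shared client per factory, created on first use.

    `factory` is a zero-argument callable; caching its result means all
    callers share a single client instance (and its connection pool).
    """
    return factory()


# Usage sketch (assumes httpx is installed):
#   client = get_client(httpx.Client)        # sync singleton
#   aclient = get_client(httpx.AsyncClient)  # async singleton
```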

sacOO7 commented 4 months ago

Created a separate load-test repo to run load tests using Locust.

(graph: requests_vs_httpx_rps)

Executed GET requests against https://rest.ably.io/time with 10 users, using singleton instances of requests and httpx.

  1. The left half of the graph (python-requests with HTTP/1.1) is very stable, with small bumps (up to 140 ms) and an average response time of 70 ms.
  2. The middle-to-right portion (httpx with HTTP/2) has lots of bumps (up to 200 ms) and an average response time of 76 ms.

PS. The average response time for a browser request to https://rest.ably.io/time is ~66 ms.

Test conducted on an Intel i7-11800H, 16 GB RAM Windows machine.

sacOO7 commented 4 months ago

For 100 users, httpx is literally crying here, with several bumps in response time relative to the number of requests made.

(graph: httpx_crying)

Compared with python-requests, we get a much more stable graph, with the lowest latency for the requests made.

(graph: python-requests-rocking)

sacOO7 commented 4 months ago

Pool settings used: for python-requests, `pool_connections = pool_maxsize = 100`; for httpx, `httpx.Limits(max_keepalive_connections=100, max_connections=100, keepalive_expiry=120)`.
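For reference, the settings above correspond to roughly this configuration (a sketch, assuming `requests` and `httpx` are installed; in `requests` the pool size is set by mounting an `HTTPAdapter`, since `Session` has no direct pool parameters):

```python
import requests
from requests.adapters import HTTPAdapter
import httpx

# python-requests: widen the connection pool on a singleton Session.
session = requests.Session()
adapter = HTTPAdapter(pool_connections=100, pool_maxsize=100)
session.mount("https://", adapter)
session.mount("http://", adapter)

# httpx: equivalent pool limits on a singleton Client.
limits = httpx.Limits(
    max_keepalive_connections=100,
    max_connections=100,
    keepalive_expiry=120,
)
client = httpx.Client(limits=limits)
```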

For 50 users, httpx again shows several bumps relative to the number of requests made. Average response time is ~90 ms.

(graph: httpx_50_connections)

Compared with python-requests, we get a much more stable graph, with the lowest latency for the requests made. Average response time is ~70 ms.

(graph: python-requests-50-users)

sacOO7 commented 4 months ago

@ttypic Until we get a proper resolution from httpx, we can point to the documentation at https://www.python-httpx.org/advanced/#pool-limit-configuration. Devs can adjust these limits according to their load requirements. This doesn't guarantee full stability, but it should reduce spikes in the requests made.