mohit-rathee opened 7 months ago
Messages typed: manually (bare hands)
Total messages sent: 400
Total time taken: 60 sec
No. of users present: 1
RESULT: 150 ms per message, no lag at all
Messages typed: automatically (Selenium)
Total messages sent: 1000
Total time taken: 159 sec
No. of users present: 2
RESULT: 159 ms per message, no lag at all
Messages typed: automatically (Selenium)
Total messages sent: 1000 (100 messages by each of 10 different users)
Total time taken: 211 sec
No. of users present: 10
RESULT: 211 ms per message, no lag at all
All tests so far were performed on a single PC with multiple Chrome instances, so due to hardware limitations the stress test is still incomplete. I have faced no lag at all up to this point.
We ran the same stress test against WhatsApp under the same network conditions, with the help of the Selenium library, and found surprising results. It took more than 600 sec to send 1000 messages (100 per minute), i.e. 600 ms per message, which is roughly 3x slower than web-chat.
Maybe it is the WhatsApp API that delivers messages in a few milliseconds, but in the browser it's not that fast. Or maybe WhatsApp's speed claims hold on its internal network of servers but aren't guaranteed over the public internet.
great 🔥 So how did you test the APIs? That is, how did you call multiple APIs at the same time (from different users), and is the reported time the total time taken by the request, or the time taken by the server to process it?
With some bash scripting and Selenium, I fired up multiple Chrome instances that create random users in the application, and they all send random messages as programmed. Code is here. The total time taken is actually time of last msg - time of first msg (which estimates the time taken to process all requests). I need to consider using other utilities like wscat, which can send far more messages into web-chat without needing to open a Chrome instance.
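The per-message numbers above follow from that "last minus first" estimate. A minimal sketch of the calculation, using the measurements reported in this thread:

```python
# Sketch of the timing estimate described above: average per-message time is
# (time of last message - time of first message) / message count.
def per_message_ms(first_ts: float, last_ts: float, count: int) -> float:
    """Return the average milliseconds per message for a run."""
    return (last_ts - first_ts) * 1000 / count

# The runs reported above:
print(per_message_ms(0.0, 60.0, 400))    # manual run   → 150.0 ms
print(per_message_ms(0.0, 159.0, 1000))  # Selenium run → 159.0 ms
```

Note this measures throughput of the whole pipeline, not the latency of any single request; a queued backlog would inflate every per-message figure equally.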
Here are the response times of various APIs used by the web-chat backend server (measurements taken from the browser's network tab):
- To send a message in a channel: 118 ms
- To create a channel: 121 ms
- To react to a message: 113 ms
- To reply to a message: 115 ms
- To DM a person: 90 ms
- To load previous messages of a channel (30 messages): 100 ms
The term Load means our browser sends the server a summary of the data it has cached; if newer data is present, the server responds with only the new data. If no cache is found, the server sends all of the messages, media, and users present on the server.
- To Load (without cache): 230 ms
- To Load (with cache): 100 ms
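The cache-aware Load described above can be sketched as a simple delta computation; the names and shapes here are illustrative, not the actual web-chat API:

```python
# Hypothetical sketch of the "Load" delta: the client reports the message IDs
# it already has cached, and the server returns only what is missing.
def load_delta(server_messages: dict[int, str], cached_ids: set[int]) -> dict[int, str]:
    """Return only the messages the client has not cached yet."""
    return {mid: msg for mid, msg in server_messages.items() if mid not in cached_ids}

server = {1: "hi", 2: "hello", 3: "new"}
print(load_delta(server, {1, 2}))  # with cache: only the new message
print(load_delta(server, set()))   # without cache: everything comes back
```

This also explains the 230 ms vs 100 ms gap: the no-cache path must serialize and ship the full dataset, not just the delta.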
At the time of testing, an SQLite3 database was chosen, which lives alongside the backend. If we opt for a database hosted outside the backend, some additional latency is to be expected.
Here are the response times of these APIs from the backend's perspective (time measured by the backend server to process each request):
- To send a message in a channel: 12 ms
- To create a channel: 31 ms
- To react to a message: 9 ms
- To DM a person: 1 ms
- To load previous messages of a channel (30 messages): 12 ms
These depend entirely on the amount of data present on the server:
- To Load (without cache): 40 ms
- To Load (with cache): 11 ms
Channel creation: why does this API take longer? Do you know the reason?
Generally one database operation takes 10-15 ms, and to create a new channel the backend needs to do two of them: one to create the channel's own table, and one to update the channel's details in another table. So overall it takes 25 to 30 ms.
So, how can we reduce the time? Think about it...
When are you free tomorrow for the meet?
The problem we are facing is due to making multiple database requests.
In the Load API we also make (n+3) requests, where n is the number of tables present and the remaining 3 are for other necessary information.
And the create-channel API fires 2 requests.
So if we can reduce these requests, we can reduce the response time too.
ORMs provide a way to run raw SQL commands too. And if we take advantage of compound SQL commands, we can see blazingly fast response times.
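A minimal sketch of the compound-SQL idea with the stdlib `sqlite3` module: the channel's table creation and its registry entry go through the database layer in one call instead of two separate ORM round trips. Table and column names here are illustrative, not web-chat's actual schema:

```python
# Sketch: bundle both statements of channel creation into one executescript()
# call, so the database layer is hit once instead of twice.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE channels (id TEXT PRIMARY KEY, name TEXT)")

def create_channel(conn: sqlite3.Connection, channel_id: str, name: str) -> None:
    # executescript() does not support parameter binding, so real code must
    # validate/escape these values; fine for a sketch with trusted input.
    conn.executescript(f"""
        CREATE TABLE "channel_{channel_id}" (msg_id INTEGER PRIMARY KEY, body TEXT);
        INSERT INTO channels (id, name) VALUES ('{channel_id}', '{name}');
    """)

create_channel(conn, "c1", "general")
print(conn.execute("SELECT name FROM channels").fetchone())  # → ('general',)
```

SQLite also runs an `executescript()` batch as a unit, which lines up with the ACID point raised below: either both statements land or neither does.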
Great 🥳, we can try that.
And for the create-channel process, we will use UUIDs and ACID operations, plus converting sync functions to async, to reduce the processing time. We will discuss these things again, so schedule a meeting tomorrow when you are free.
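One way the UUID point helps, sketched under the assumption that channel IDs are currently assigned by the database: generating the ID in application code with `uuid4` removes the read-back round trip needed to learn the new row's ID.

```python
# Hypothetical sketch: mint the channel ID in the application instead of
# waiting for the database to assign one, so the INSERT needs no read-back.
import uuid

def new_channel_id() -> str:
    """Return a collision-resistant channel ID without touching the database."""
    return str(uuid.uuid4())

cid = new_channel_id()
print(len(cid))  # → 36 (8-4-4-4-12 hex digits joined by hyphens)
```
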
Also create another branch in this repo, so that we can manage the main branch easily.
We successfully executed compound SQL (creating new tables in the database and updating their details in another table), and it generally took 8 to 9 ms. The next task was to reflect those changes in our ORM:
First approach: when we used the generic (reflect/automap) functions to map all changes from the database into our ORM, the cost scaled linearly: as the number of tables increases, the time taken by these functions also increases. RESULT: even with just 50 tables, the autoload function took 1 sec.
Second approach: by studying how the SQLAlchemy library works (it creates classes from metadata requested from the database), we optimised our application to manually build the class and metadata in the ORM and then map the class onto that table's metadata. This way, mapping a new table takes constant time. RESULT: around 4 to 5 ms.
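The second approach can be sketched with SQLAlchemy's imperative mapping: the `Table` metadata is constructed in Python (no reflection round trip) and a dynamically created class is mapped onto it. The schema and names are illustrative, not web-chat's actual code:

```python
# Sketch of constant-time mapping: build Table metadata by hand and map a
# fresh class onto it with registry.map_imperatively(), instead of
# reflecting/automapping the whole database (linear in table count).
from sqlalchemy import Column, Integer, MetaData, String, Table, create_engine
from sqlalchemy.orm import Session, registry

engine = create_engine("sqlite:///:memory:")
metadata = MetaData()
mapper_registry = registry()

def map_channel_table(name: str):
    """Hand-build one table's metadata and map a dynamic class onto it."""
    table = Table(
        f"channel_{name}", metadata,
        Column("msg_id", Integer, primary_key=True),
        Column("body", String),
    )
    cls = type(f"Channel_{name}", (), {})        # dynamically created class
    mapper_registry.map_imperatively(cls, table)  # O(1) per table, no reflection
    table.create(engine)                          # one CREATE TABLE round trip
    return cls

General = map_channel_table("general")
with Session(engine) as session:
    session.add(General(body="hello"))
    session.commit()
    print(session.query(General).count())  # → 1
```

`map_imperatively` is the SQLAlchemy 1.4+/2.0 spelling of classical mapping; it installs a default keyword constructor on the class, which is why `General(body="hello")` works without defining `__init__`.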
According to this, creating a new channel will take approximately 15 to 17 ms. 🥳 Note that this experiment was done independently of web-chat, so its results can be reproduced in any project. Experiment code
cool 🔥
To send a message in a channel: 12 ms. Now work on this API.
Test web-chat under various stressful conditions and give an analysis of its behaviour under those conditions.