llucax / python-grpc-benchmark

A simple benchmark comparing Google's protobuf/grpc.aio with betterproto/grpclib
MIT License

Improper benchmarking of throughput #2

Open cathaypacific8747 opened 4 months ago

cathaypacific8747 commented 4 months ago

Hi,

Thanks for the benchmark! I noticed that the benchmark uses timeit, and each iteration repeatedly creates a new gRPC channel:

https://github.com/llucax/python-grpc-benchmark/blob/94a175fbcbb7d144506b5f5099c8b87e9f21c658/benchmark#L21

https://github.com/llucax/python-grpc-benchmark/blob/94a175fbcbb7d144506b5f5099c8b87e9f21c658/grpcio/client.py#L9-L12

However, a more realistic scenario would involve creating a single channel and reusing it across many RPC calls. I've forked the repo and re-ran the benchmarks for multiple payload sizes:

It seems that grpc.aio (blue) actually performs significantly better than grpclib (orange) for single unary calls, but slightly worse for large streaming requests.

[Plots: time_say_hello, time_stream_numbers]

Test conditions: i5-13600K, 32GB RAM, Python 3.11, 1000 loops
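The effect being described can be sketched without gRPC at all. Below, `FakeChannel` is a hypothetical stand-in for a grpc.aio channel, and the 1 ms sleep in its constructor stands in for connection setup (TCP + HTTP/2 handshake); the names and costs are illustrative, not taken from the benchmark:

```python
import asyncio
import time


class FakeChannel:
    """Stand-in for a gRPC channel: creation simulates handshake cost."""

    def __init__(self) -> None:
        time.sleep(0.001)  # pretend connection setup

    async def call(self) -> str:
        return "hello"  # pretend RPC; negligible cost here

    async def close(self) -> None:
        pass


async def per_call_channel(n: int) -> float:
    """The benchmarked pattern: a fresh channel on every iteration."""
    start = time.perf_counter()
    for _ in range(n):
        channel = FakeChannel()
        await channel.call()
        await channel.close()
    return time.perf_counter() - start


async def reused_channel(n: int) -> float:
    """The more realistic pattern: one channel, many calls."""
    start = time.perf_counter()
    channel = FakeChannel()
    for _ in range(n):
        await channel.call()
    await channel.close()
    return time.perf_counter() - start


fresh = asyncio.run(per_call_channel(100))
reused = asyncio.run(reused_channel(100))
print(f"per-call channel: {fresh:.3f}s, reused channel: {reused:.3f}s")
```

With setup inside the timed loop, the measurement is dominated by channel creation rather than by the RPC itself, which is why the two libraries' call-path performance gets masked.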

llucax commented 4 months ago

Very interesting, thanks a lot for sharing!

fighterhit commented 3 months ago

The official documentation mentions in the performance best practices section that unary RPCs in Python have higher performance than streaming RPCs, because streaming involves extra threads: https://grpc.io/docs/guides/performance/#python
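Following that guidance, a common workaround is to batch what would be a server-streaming response into a single unary message with a repeated field. A minimal sketch of the two message shapes, with plain dataclasses standing in for protobuf messages (all names here are hypothetical, not from the benchmark's proto):

```python
from dataclasses import dataclass, field
from typing import Iterator, List


@dataclass
class Number:
    """Streaming-style payload: one message per value."""
    value: int


@dataclass
class NumberBatch:
    """Unary-style payload: one message carrying the whole sequence
    (models a protobuf `repeated` field)."""
    values: List[int] = field(default_factory=list)


def stream_numbers(n: int) -> Iterator[Number]:
    """Server-streaming shape: each message incurs per-message
    overhead (extra thread work in grpc's Python implementation)."""
    for i in range(n):
        yield Number(value=i)


def batch_numbers(n: int) -> NumberBatch:
    """Unary shape: the same data in a single response message."""
    return NumberBatch(values=list(range(n)))


streamed = [msg.value for msg in stream_numbers(5)]
batched = batch_numbers(5).values
```

Same data either way; the unary/batched shape trades per-message flow control for fewer round trips and less per-message overhead on the Python side.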
