Closed — julian-sotec closed this issue 10 months ago
Hi,
PingAndWarm requests are sent once per channel during channel creation. Which happens when the client is initialized and every 50 mins there after. The purpose of PingAndWarm is to minimize perceived latency of a cold connection.By sending an RPC on each channel we ensure that it is established and all of the caches along the way have been warmed.
The lifecycle of the ChannelPool is tied to the client, which should be tied to the life of your application process and should not be re-created per request. So to avoid the 10x blowup, please make sure you keep the client alive. One RPC per channel every 50 minutes should not have a meaningful impact on a service that is expected to handle 10k QPS per node.
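The "keep the client alive" advice can be sketched with a plain-Java holder. `FakeBigtableClient` below is a hypothetical stand-in for the real `BigtableDataClient`; the point is the pattern (one client per process, reused across requests), not the Bigtable API itself:

```java
/**
 * Minimal sketch of keeping one long-lived client per process, assuming a
 * hypothetical FakeBigtableClient in place of BigtableDataClient. The client
 * is created once and reused, so channel creation (and the PingAndWarm RPCs
 * that come with it) happens only at startup, not per request.
 */
public class ClientHolder {
    // Stand-in for com.google.cloud.bigtable.data.v2.BigtableDataClient.
    static class FakeBigtableClient implements AutoCloseable {
        static int created = 0;
        FakeBigtableClient() { created++; }
        @Override public void close() {}
    }

    // One client for the whole process lifetime; never re-created per request.
    private static final FakeBigtableClient CLIENT = new FakeBigtableClient();

    static FakeBigtableClient client() { return CLIENT; }

    public static void main(String[] args) {
        // Simulate many requests: all of them reuse the same client.
        for (int i = 0; i < 1_000; i++) {
            if (ClientHolder.client() == null) throw new AssertionError();
        }
        System.out.println("clients created: " + FakeBigtableClient.created); // prints 1
    }
}
```

Closing the client (and with it the channel pool) belongs in application shutdown, not in per-request cleanup.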
If there is something that I'm missing and there is a use case for short-lived clients or for disabling channel priming, please reopen the ticket.
Thanks
We are using a BOM-based approach, which keeps the Google Java client libraries in sync across all the services we use (Datastore, BigQuery, Cloud Storage, Bigtable, Cloud Tasks). The specific method that we use to ingest data into Bigtable is bulk mutate rows.
After updating and deploying, we noticed that the number of Bigtable API requests was 10 times higher than with the previously deployed version. We narrowed it down to a problem starting with version 2.10.0 of the Bigtable API client for Java (https://cloud.google.com/bigtable/docs/release-notes#August_01_2022). It seems to be related to the PingAndWarm requests introduced in that version, which are sent before each interaction with Bigtable. Impact: 300% latency increase on Bigtable operations --> 50% more App Engine instances spawned --> 40-50% higher cloud costs.
We have now decided to downgrade to version 2.9.0, but we are no longer in sync with the other Google client libraries, since pinning this version is a manual maintenance step.
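The manual pin described here might look like the following Maven fragment: the `libraries-bom` import manages all the Google client versions, and an explicit `2.9.0` on `google-cloud-bigtable` overrides the BOM-managed one (`BOM_VERSION` is a placeholder for whatever BOM release the build currently uses):

```xml
<!-- Sketch: pin google-cloud-bigtable to 2.9.0 while the BOM manages the rest. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <version>BOM_VERSION</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

<dependencies>
  <!-- An explicit version here takes precedence over the BOM-managed one. -->
  <dependency>
    <groupId>com.google.cloud</groupId>
    <artifactId>google-cloud-bigtable</artifactId>
    <version>2.9.0</version>
  </dependency>
</dependencies>
```

This is exactly the drift risk mentioned above: the pinned version must be revisited by hand on every BOM upgrade.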
Because of this, we want to understand:
Any insights on this topic are much appreciated.