twmb / franz-go

franz-go contains a feature complete, pure Go library for interacting with Kafka from 0.8.0 through 3.6+. Producing, consuming, transacting, administrating, etc.
BSD 3-Clause "New" or "Revised" License

Optimistically cache mapped metadata when cluster metadata is periodically refreshed #725

Closed — pracucci closed 1 month ago

pracucci commented 1 month ago

In Mimir, we have a process creating 1 Client (setting `MetadataMinAge = MetadataMaxAge = 10s`) and then using it to do two things:

  1. Consume a partition
  2. Request ListOffsets every 1s

I noticed that such a Client issues 2 Metadata requests every `MetadataMinAge` (remember that we set `MetadataMaxAge = MetadataMinAge`).

One request is due to the periodic Metadata refresh, and one is due to the ListOffsets request, which needs Metadata as well. The ListOffsets path calls `Client.fetchMappedMetadata()`, which caches the metadata for `MetadataMinAge` in a separate cache, effectively leading to 2 Metadata requests every `MetadataMinAge`.

In this PR I'm proposing to optimistically cache the mapped metadata when cluster metadata is periodically refreshed.

Notes

twmb commented 1 month ago

pt2, "Do you see any issue" -- probably not. I currently take deliberate care within the client to not go through the end-user Request path, so as to avoid this secondary caching for the general consumer/producer. Of the 7 requests that use fetchMappedMetadata, 5 are admin requests, and the other two are ListOffsets and OffsetForLeaderEpoch. These latter two are only used while consuming, and I have a fairly large chunk of code in consumer.go dedicated to issuing these requests directly (and handling the responses in a paranoid way). So, the change here to always pipe producer/consumer metadata responses through the separate fetchMappedMetadata cache will increase memory.

I'll look through the code change / callback more when I get time soon; the past two months have been busier than I was expecting, but getting through issues & PRs is actually at the top of the priority list at the moment.

twmb commented 1 month ago

Good enough, we'll see if somebody even notices a minor memory bump. +1!

pracucci commented 1 month ago

Thanks for the review! Feel free to revert (or ping me) if anyone complains about it.