hlship opened 7 years ago
One simple approach might be to allow passing an interceptor as an option. That way you can do any computation you want before the query is parsed. Might be a little too powerful, but it should definitely work.
It doesn't try to solve the problem of persistence, but it leaves open how you want to fetch the queries. If you need to do it asynchronously, you can just return a channel like any interceptor. Here's a quick proof of concept. It might be better just to make it a function, and perhaps pass it only a subset of the data instead of the whole context.
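A sketch of what such an interceptor might look like in Pedestal terms (this is not the actual proof of concept from the thread; the store, the request keys, and the interceptor name are all illustrative assumptions):

```clojure
(require '[io.pedestal.interceptor :refer [interceptor]])

;; Hypothetical in-memory store; a real implementation would consult a
;; database or distributed cache.
(def query-store
  (atom {"hero-by-id" "query ($id: ID!) { hero(id: $id) { name } }"}))

(def resolve-stored-query
  (interceptor
    {:name ::resolve-stored-query
     ;; Runs before the query-parser interceptor: if the request names a
     ;; stored query, substitute the stored document for the query text.
     :enter (fn [context]
              (if-let [query-id (get-in context [:request :graphql-query-id])]
                (assoc-in context [:request :graphql-query]
                          (get @query-store query-id))
                context))}))
```

Because Pedestal allows `:enter` to return a core.async channel instead of a context map, the lookup could be made asynchronous without any additional support from the library.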
An interceptor is exactly what I was thinking, along with some basic work to support looking up a parsed query by name, including a hash fingerprint.
Has there been any progress on this? @hlship, you mentioned on Slack that you are working on making the interceptor chain more malleable; I assume that will solve this issue?
Since we don't yet use this in the Walmart code, we don't have a battle-tested solution for it. I'll bring it up internally ... I'd like to see some support for this, but we need a server-side strategy and infrastructure to abstract over.
We've been iterating on this internally; the consensus was to have yet another endpoint for stored queries. A full implementation also includes a RESTful POST endpoint for creating queries, but that's likely beyond what we can supply in library code (as it involves authentication and other details that are very application specific).
It would be nice if there was an overall approach to server-stored query documents. This is a bit tricky, since we need to abstract over how such documents are persisted on the server and shared across a cluster. But at the root of it, the client should be able to send a request that simply identifies the query by some form of ID, and optionally supply variables and/or an operation name. This can greatly reduce the size of the request sent to the server, and slice a chunk of processing time out of handling the request when the parsed query is already cached in memory.
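As a minimal sketch of the fingerprint-as-ID idea mentioned earlier in the thread: the ID could be a SHA-256 hash of the query document, keying an in-memory cache of parsed queries. Here `parse-query` is only a placeholder; the real thing would be lacinia's parser, which needs a compiled schema, and the cache would need a cluster-aware backing store.

```clojure
;; Placeholder for com.walmartlabs.lacinia.parser/parse-query, which
;; requires a compiled schema; left abstract in this sketch.
(defn parse-query [document] document)

;; In-memory cache of parsed queries, keyed by fingerprint.
(def parsed-queries (atom {}))

(defn fingerprint
  "SHA-256 hash of the query document, rendered as a hex string."
  [^String document]
  (let [digest (java.security.MessageDigest/getInstance "SHA-256")]
    (format "%064x" (BigInteger. 1 (.digest digest (.getBytes document "UTF-8"))))))

(defn store-query!
  "Parses and caches a query document; returns its fingerprint ID."
  [document]
  (let [id (fingerprint document)]
    (swap! parsed-queries assoc id (parse-query document))
    id))

(defn parsed-query-for
  "Looks up a previously stored parsed query by its ID, or nil."
  [id]
  (get @parsed-queries id))
```

A request would then carry only the ID (plus variables and an optional operation name), and the server skips parsing entirely on a cache hit.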