lukechampine / walrus

A wallet server for Sia
https://lukechampine.com/docs/walrus
MIT License

Add "batch query" endpoints #10

Closed lukechampine closed 4 years ago

lukechampine commented 4 years ago

The current API design is simple and easy to understand, but horribly inefficient for certain operations. For example, to retrieve the last 100 transactions, you need to make 101 API calls! This is because each transaction (or address, or output, etc.) is a separate resource and so must be queried separately.

There are a few ways we could address this. One of the most popular is the "batch job" approach:

POST /batch { ... query ... } -> returns :id
GET /batch/:id -> returns result of query (or "pending", whatever)

A more exotic option also occurred to me:

GET /transactions?max=5&map=/transactions/

This would query the /transactions endpoint for the 5 most recent transactions, then "map" the resulting ids into GET /transactions/:id requests. Pretty neat, but there's a big problem: you can't pass your own list of IDs to query; they have to originate in some other API call.

Both of those options are more flexible than what we really need, so here's what I'm currently leaning towards:

POST /batchquery/transactions { ... list of ids ... } -> returns list of /transactions/ responses
POST /batchquery/addresses { ... list of addrs ... } -> returns list of /addresses/ responses
etc.

This approach should work fine, because all of the walrus resources are grouped into hierarchies and have unique identifiers.

Lastly: I think if I could wave a magic wand and convert everything to GraphQL, I'd probably do that. IIUC GraphQL would let clients make custom requests that return exactly what they need, without having to make multiple requests. Unfortunately, no such magic wand exists, so the conversion would take a lot of work.

lukechampine commented 4 years ago

Implemented in https://github.com/lukechampine/walrus/commit/981aaa761b29ec5c37236ffa8bbc8f0aa045fb53. See #11 for follow-up.