Closed maoueh closed 1 year ago
What do we do with maximum results and pagination for requests such as `Scan` and `Prefix`? Without "time-traversal" properties on the KV store, we will end up with the same design problem as the eth RPC calls...
Small analysis of concurrency problems:
On our own `netkv`, I don't see a good solution.
Reopening because the task will be closed once the instructions have been sent to Pinax and our Discord.
Also, here are a few things that are missing:
- [x] Some minor updates to README.md:
Notes and warnings should be formatted with the new GitHub Note/Warning syntax (see raw source of this comment for raw markup):

> **Note**
> This is an indented note

Needs a paragraph in between

> **Warning**
> And a warning
(these commands have been tested while running the `devel/local/start.sh` in another terminal)

should be in a note after the commands. Same thing for the browser note: (this has been tested while running the `devel/local/start.sh` in a terminal)
Cargo crate
heading is wrong, it's bigger than the others. Also, the section should be improved to better explain how to add it and use it, something quick; the `substreams-sink-files` tutorial currently in PR can be used.

@maoueh please review https://substreams.streamingfast.io/developers-guide/substreams-sinks/substreams-sink-kv#consume-the-key-value-data-from-a-web-page-using-connect-web

then we can close this issue :D
All good, just some alignment of code blocks with bullet lists; closing anyway so we move on.
Now that we have a Substreams sink to KV (https://github.com/streamingfast/substreams/issues/93), we will now try to have something easier for the developers to consume the KV store.
We are going to slap a gRPC + Connect Web layer on top of the `substreams-sink-kv`. At the infrastructure level, we expect a single binary to run that will do 2 things:

And the payload would simply be an `any.Any`, with the value of the key, and the client would do the decoding, because it knows what it's querying.

While doing this, don't forget:
- Design should allow these example substreams to be built