Open Arjentix opened 7 months ago
@Stukalov-A-M, @mversic , @Mingela , @0x009922 , your input will be valuable here as well
There might be a huge caveat with transaction signatures. It is based on a SCALE-encoded transaction payload. If clients operate with JSON only, it means the only way for them to sign a transaction is to work with its JSON payload. Therefore, Iroha will also have to work with JSON representations of transactions all the time. AFAIK, it is a huge performance cost.
Signing a transaction will be kind of hard for basic users anyway, so we could still sign using SCALE, because for signing they have to use some pre-built tool anyway.
From the user side it will look kind of like this:

```json
{
  "payload": {
    "instructions": [
      {
        "Mint": {
          "id": "rose#alice@wonderland",
          "object": "1_u32"
        }
      }
    ],
    "timestamp": 132435435
    // ...
  },
  "signature": "unreadable-signature-from-some-tool"
}
```
I had a discussion with @Stukalov-A-M and now we think that using one of the universal formats like bson or ubjson will be a good fit for transaction signatures.
So a user will submit a transaction in the form described in my previous comment, where the signature will be over the transaction payload encoded with e.g. bson and signed with the user's private key. On the Iroha side we will re-encode the JSON payload into bson and check that the signature is valid. After that it should be enough to gossip just the bson representation and the signature, which should be pretty OK in terms of performance.
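The flow above can be sketched roughly as follows. This is only an illustration of the verification idea, not Iroha's implementation: canonical JSON bytes stand in for the proposed bson encoding (the point is only that the server can deterministically re-encode the payload), and a symmetric HMAC stands in for the real public-key signature scheme. All names are hypothetical.

```python
import hashlib
import hmac
import json

def canonical_bytes(payload: dict) -> bytes:
    # Stand-in for a deterministic binary encoding such as bson:
    # canonical JSON with sorted keys and no whitespace.
    return json.dumps(payload, sort_keys=True, separators=(",", ":")).encode()

def sign(payload: dict, key: bytes) -> str:
    # Stand-in for signing with the user's private key.
    return hmac.new(key, canonical_bytes(payload), hashlib.sha256).hexdigest()

def verify(tx: dict, key: bytes) -> bool:
    # Server side: re-encode the JSON payload deterministically
    # and check the signature against those bytes.
    expected = sign(tx["payload"], key)
    return hmac.compare_digest(expected, tx["signature"])

key = b"demo-key"
payload = {
    "instructions": [{"Mint": {"id": "rose#alice@wonderland", "object": "1_u32"}}],
    "timestamp": 132435435,
}
tx = {"payload": payload, "signature": sign(payload, key)}
assert verify(tx, key)
```

Note the sketch only works because `canonical_bytes` is deterministic; that determinism is exactly what a real bson/ubjson-based scheme would have to guarantee.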
I don't understand how it will allow us to use something like Postman if the user anyway has to add custom signatures and use two encodings.
> I don't understand how it will allow us to use something like Postman if the user anyway has to add custom signatures and use two encodings.
From the point of view of a basic user, nothing changes indeed, because they will still need some tool to sign a transaction.
From the point of view of an SDK developer, things will be much easier: bson is far better supported across different languages than parity SCALE.

Let me try to be concise and give point-based feedback:
- Postman: though, as shown in the discussion, it won't eliminate the need for a tool to sign stuff. Why not create a tool mapping JSON to SCALE and back instead, then? And basically a user is expected to use an SDK or CLI which provides "human readability".
- Obviously, including JSON support (or replacing SCALE) would incur an enormous effort demand, which is unacceptable at the current project stage and roadmap. In the future it might be interesting to implement pluggable serialization, but I don't see sufficient motivation to include that in the nearest months for sure.
> Why not create a tool mapping JSON to SCALE and back instead, then?
This question was raised many times already in different forms. For example, here:
We could create a web helper which works with the data model schema and helps users build any structure they want and encode it, or decode a bunch of bytes and see what's inside. We are planning to implement it in post-release, and publish it directly in Iroha 2 online docs.
We could also create a lower-level tool which will work exactly with the JSON format, even on the web (AFAIK iroha_data_model is no_std, therefore it might be wrapped into WebAssembly). As for signing data online, it is also possible to do with iroha-javascript.
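Such a JSON-to-SCALE mapping tool would, among other things, have to implement SCALE's compact integer encoding. Below is a minimal sketch of just that piece (function names are illustrative; a real tool would cover the full data-model schema):

```python
def compact_encode(n: int) -> bytes:
    # SCALE compact integer encoding: the two low bits of the first
    # byte select one of four modes.
    if n < 0:
        raise ValueError("compact encoding is for unsigned integers")
    if n < 1 << 6:                                  # single-byte mode
        return bytes([n << 2])
    if n < 1 << 14:                                 # two-byte mode
        return ((n << 2) | 0b01).to_bytes(2, "little")
    if n < 1 << 30:                                 # four-byte mode
        return ((n << 2) | 0b10).to_bytes(4, "little")
    data = n.to_bytes((n.bit_length() + 7) // 8, "little")
    return bytes([((len(data) - 4) << 2) | 0b11]) + data  # big-integer mode

def compact_decode(b: bytes) -> int:
    mode = b[0] & 0b11
    if mode == 0b00:
        return b[0] >> 2
    if mode == 0b01:
        return int.from_bytes(b[:2], "little") >> 2
    if mode == 0b10:
        return int.from_bytes(b[:4], "little") >> 2
    length = (b[0] >> 2) + 4
    return int.from_bytes(b[1:1 + length], "little")

assert compact_encode(1) == b"\x04"          # single-byte mode
assert compact_encode(69) == b"\x15\x01"     # two-byte mode
assert compact_decode(compact_encode(2**40)) == 2**40
```

The mode bits make the encoding compact but position-dependent, which is part of why generic JSON tooling cannot read SCALE directly and a dedicated mapping tool is needed.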
@0x009922 looks like a strong argument to wrap up the discussion
> There might be a huge caveat with transaction signatures. It is based on a SCALE-encoded transaction payload. If clients operate with JSON only, it means the only way for them to sign a transaction is to work with its JSON payload. Therefore, Iroha will also have to work with JSON representations of transactions all the time. AFAIK, it is a huge performance cost.

> I don't understand how it will allow us to use something like Postman if the user anyway has to add custom signatures and use two encodings.
We can make a switch to JSON easily but keep using SCALE encoding for signatures. This will make the payload readable with Postman; signatures are an array of bytes and unreadable anyway.
I'm not sure how big of a concern performance is, but I don't think it's important for client-server communication. I would prioritize the readability of JSON in this case.
I'm not a fan of the combined approach because it brings additional complexity to the system.
I would say that for peer-to-peer communication we definitely want a serialization format with a schema, like SCALE. The same goes for anything that we store on the chain and on disk. For this reason sumeragi and kura should keep using SCALE.
As for the signatures: does the encoding format affect signature size? I don't think it does, so from this point of view it is not a concern, and we could use other encoding formats for signing.
The second concern is performance. Naturally, schemaless serialization formats will be slower to encode and decode. Do we want to pay this price when signing a transaction? I would say it's OK for client-server communication, but when peers sign blocks, I think we would want to use a serialization format with a schema.
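Both points above can be illustrated with a toy comparison. Here `struct.pack` is a rough stand-in for a schema-based codec like SCALE, and HMAC-SHA256 a stand-in for the real signature scheme; the record layout and field names are made up for the example:

```python
import hashlib
import hmac
import json
import struct

# A toy "Mint" record, once as schemaless JSON, once in a fixed
# schema-based layout (three u32 fields and one u64 timestamp).
record = {"asset": 7, "account": 42, "amount": 1, "timestamp": 132435435}

json_bytes = json.dumps(record, separators=(",", ":")).encode()
packed_bytes = struct.pack("<IIIQ", record["asset"], record["account"],
                           record["amount"], record["timestamp"])

# Sign both encodings (HMAC-SHA256 as a stand-in for ed25519 & co.).
key = b"demo"
sig_json = hmac.new(key, json_bytes, hashlib.sha256).digest()
sig_packed = hmac.new(key, packed_bytes, hashlib.sha256).digest()

# The schemaless encoding is larger, because field names and digits
# travel with every message...
assert len(json_bytes) > len(packed_bytes)

# ...but the signature size is fixed by the algorithm, not by the
# encoding of the signed bytes.
assert len(sig_json) == len(sig_packed) == 32
```

This matches the comment above: the encoding choice affects message size and encode/decode cost, not signature size.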
Overall, unless you think that the performance hit of using a schemaless serialization format is tolerable, I think that signing transactions should keep using SCALE. However, I'm quite sympathetic to the idea that we should use schemaless serialization between client and server to make it easier to serialize/deserialize. Yes, this would mean that we use two serialization formats, but I don't find this to be so bad. Why do we even derive Serialize/Deserialize for Transaction?
To sum up on the discussion in the meeting we had today:
- SDK developers can run parity_scale_decoder as an external process from their own program. parity_scale_decoder should be enhanced so that it supports encoding/decoding of SCALE to/from JSON; this will be handled in a separate issue.
- … (FindPermissionTokenSchema query)
- The API may be described with OpenAPI, e.g. via the crate utopia, but the actual choice remains to be determined.
Originally proposed by @Stukalov-A-M.
Parity SCALE codec makes it very inconvenient and almost impossible to use tools like Postman to interact with an Iroha peer, while having this ability would much simplify UX and improve Iroha's attractiveness.

Another advantage is that it greatly simplifies Torii API documentation. All we need is an OpenAPI-compatible description of our types, which should not be a problem with already existing crates, e.g. utopia. This also makes our schema crate useless and worth removing, IMO.

With a simple and human-readable Torii API (not like now) we also open the doors to external SDK devs. E.g. if someone would like to create a Swift SDK, they would need no more info than just the Torii API docs.