Factom-Asset-Tokens / FAT

Factom Asset Tokens - Open tokenization standards on Factom

Migrate SISO => MIMO in FAT-0 as per #7 #10

Closed · drkatz closed this 6 years ago

drkatz commented 6 years ago

Transitions single-input single-output functionality to multi-input multi-output. Closes #7.

AdamSLevy commented 6 years ago

Additionally, the spec could be clearer about exactly what is being signed.

Currently the Validation column for RCD and Signature just says "Factoid Transaction Validation", but that doesn't directly apply to us here.

The factoid transaction data structures are defined as binary protocols. The structure roughly looks like this:

[HEADER] [INPUTS] [OUTPUTS] [EC OUTPUTS] [RCD/SIGNATURES]

To validate a factoid transaction, you verify that the sum of the outputs is less than the sum of the inputs, with a difference sufficient to cover the fee and any EC conversions. Then for each input, you verify that the corresponding RCD hashes to the input's address, that the signature is valid for the public key in the RCD, and that the signature covers everything in the tx up to but not including the [RCD/SIGNATURES] section. No two txs may have the same inputs, outputs, EC outputs, and timestamp (which is part of the [HEADER]), and that protects against tx replay.
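
To make the "what is signed" part concrete, here's a rough Go sketch. The field names and the amount encoding are my own stand-ins (the real binary layout, including variable-length integers, lives in the Factom datastructures doc), so treat this as illustrative rather than authoritative:

```go
package factoid

import "encoding/binary"

// Illustrative only: field names and layout are mine, not the official spec's.
type Amount struct {
	Value   uint64
	Address [32]byte
}

type FactoidTx struct {
	Header    []byte   // version, millisecond timestamp, counts
	Inputs    []Amount // FCT being spent, keyed by RCD hash
	Outputs   []Amount // FCT destinations
	ECOutputs []Amount // FCT converted to Entry Credits
	RCDs      [][]byte // one RCD per input
	Sigs      [][]byte // one signature per input
}

// marshalAmounts is a stand-in encoding; the real spec uses variable-length
// integers for the values.
func marshalAmounts(as []Amount) []byte {
	var b []byte
	for _, a := range as {
		var v [8]byte
		binary.BigEndian.PutUint64(v[:], a.Value)
		b = append(b, v[:]...)
		b = append(b, a.Address[:]...)
	}
	return b
}

// signedBytes is what every signature covers: the marshaled header, inputs,
// outputs, and EC outputs -- everything *before* the RCD/signature section.
func signedBytes(tx FactoidTx) []byte {
	var b []byte
	b = append(b, tx.Header...)
	b = append(b, marshalAmounts(tx.Inputs)...)
	b = append(b, marshalAmounts(tx.Outputs)...)
	b = append(b, marshalAmounts(tx.ECOutputs)...)
	return b
}

// sufficientInputs checks that the inputs cover the outputs, the EC
// conversions, and the fee.
func sufficientInputs(tx FactoidTx, fee uint64) bool {
	total := func(as []Amount) (sum uint64) {
		for _, a := range as {
			sum += a.Value
		}
		return
	}
	return total(tx.Inputs) >= total(tx.Outputs)+total(tx.ECOutputs)+fee
}
```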

My current understanding is that for our purposes, the signature is supposed to be a signature of the hashed (djb2a) salt. The spec currently requires that salts be unique across the entire transaction history of the token to protect against replay attacks.

The Validation column for the RCD and sig needs to be changed to something more accurate and descriptive than Factoid Transaction Validation.

I'll note here that I came up with a new way to protect against replay attacks that removes both the need to verify uniqueness of salts across the entire tx history and the need to track a nonce per address! I will open a new issue to discuss this change.

drkatz commented 6 years ago

Thanks @AdamSLevy, I glossed over this detail while reading the datastructures document for multiple transactions, oops. I think we should stay as close as possible to how Factoid transactions already work, since there's already good code we can use (e.g. the fat-js implementation currently uses @PaulBernier's Factoid TX data structure for SISO txs; using the Go library equivalent could be a good approach).

My current understanding is that for our purposes, the signature is supposed to be a signature of the hashed (djb2a) salt. The spec currently requires that salts be unique across the entire transaction history of the token to protect against replay attacks.

The salt you're referring to is used in place of Header[milliTimestamp] in the Factoid TX, since we don't enforce timestamps in FAT (it's deferred to Factom's entry timestamp), just to clarify. So it's not technically what is signed, but yes, it's important for creating unique transactions that aren't so easy to replay.

The Validation column for the RCD and sig needs to be changed to something more accurate and descriptive than Factoid Transaction Validation.

Most definitely. The Factom datastructures doc specifies "RCD X: It hashes to input X" and "Signature X: This is the data needed to satisfy RCD X", which is quite vague and disappointing (thanks, Inc. :disappointed:). I'm having a hard time coming up with a more detailed way to put it, since I barely understand what it means myself.

So far I've been reverse engineering and black-boxing a lot of this stuff in my implementations for this reason. I know the signing currently implemented works because I've tried maliciously manipulating fields in the TX and then running validation, which behaves as expected (see here). But I have to admit this is not a good way to work up to writing detailed docs. I'm in the process of working through the datastructures doc and some of Luap's code to help fill in the gaps.
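
To illustrate the kind of black-box check I mean, here's a minimal sketch (generic ed25519 in Go, not the actual fat-js code): flip a byte in a signed field and verification should fail.

```go
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"fmt"
)

// Sign some stand-in transaction bytes, then flip a field and confirm that
// verification rejects the tampered data.
func main() {
	pub, priv, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		panic(err)
	}

	signedData := []byte("header|inputs|outputs|ecoutputs") // stand-in for the marshaled tx
	sig := ed25519.Sign(priv, signedData)
	fmt.Println("untampered valid:", ed25519.Verify(pub, signedData, sig)) // true

	tampered := append([]byte(nil), signedData...)
	tampered[0] ^= 0xff // "maliciously" flip a byte in a signed field
	fmt.Println("tampered valid:", ed25519.Verify(pub, tampered, sig)) // false
}
```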

drkatz commented 6 years ago

I'll note here that I came up with a new way to protect against replay attacks that removes both the need to verify uniqueness of salts across the entire tx history and the need to track a nonce per address! I will open a new issue to discuss this change.

Very excited to see what you came up with! If we can truly do this it will make everything much more space efficient and performant. Will be awaiting your issue :+1:

AdamSLevy commented 6 years ago

I think we should stay as close as possible to how Factoid transactions already work since there's already good code that we can use

Okay, but the Factoid Transaction data structures are binary, and we are creating a JSON spec. If we go that route, we need to define the mapping from one to the other. This will require implementations to parse the human-readable Factoid Addresses into the raw data of the RCD hash. Not hard, just noting the step. And you're right, that is already implemented.
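
Roughly, that parsing step looks like the sketch below, assuming a base58 package like btcsuite's and the address layout I describe further down (prefix bytes + RCD hash + checksum). Take the offsets and the double-SHA-256 checksum as my reading of the address spec, not the canonical implementation:

```go
package fat

import (
	"bytes"
	"crypto/sha256"
	"fmt"

	"github.com/btcsuite/btcd/btcutil/base58"
)

// decodeFA recovers the raw 32-byte RCD hash from a human-readable FA address.
// Assumed layout: 2 prefix bytes + 32-byte RCD hash + 4-byte checksum, where
// the checksum is the first 4 bytes of SHA-256d(prefix + hash).
func decodeFA(addr string) ([32]byte, error) {
	var hash [32]byte
	raw := base58.Decode(addr)
	if len(raw) != 2+32+4 {
		return hash, fmt.Errorf("unexpected address length %d", len(raw))
	}
	body, check := raw[:34], raw[34:]
	sum := sha256.Sum256(body)
	sum = sha256.Sum256(sum[:])
	if !bytes.Equal(check, sum[:4]) {
		return hash, fmt.Errorf("bad checksum")
	}
	copy(hash[:], body[2:])
	return hash, nil
}
```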

The salt you're referring to is used in place of Header[milliTimestamp] in the Factoid TX, since we don't enforce timestamps in FAT (it's deferred to Factom's entry timestamp), just to clarify.

Yes, I understand. Both the salt (for FATIP-0) and the timestamp (for Factom) are part of preventing replay attacks.

So, it's not technically what is signed, but yes it's important for creating unique transactions that aren't so easy to replay.

Well, it must be part of the signed data; otherwise it isn't doing anything at all for replay protection.

Whatever is signed is not yet well-defined in the FATIP-0 spec. We should spell this out.

In the Factom datastructures doc it specifies RCD X: It hashes to input X and Signature X: This is the data needed to satisfy RCD X ... I am having a hard time even understanding what this means.

I understand. I had some trouble understanding how Factoid transaction signing worked myself; I had to go back to the Bitcoin wiki pages on P2SH and Bitcoin scripts. Maybe the following will help you understand the Factoid transaction spec.

The RCD concept makes a lot of sense in the context of Bitcoin pay-to scripts. In case you aren't aware, Bitcoin defines a simple, non-Turing-complete (no loops), stack-based bytecode scripting language with basic arithmetic and crypto operations. A transaction is valid if, for each input being spent, it presents arguments to that input's pay-to script that make the script return true. The most common pay-to script is what you would expect: one that requires a signature of the transaction data that validates against the input's public key.

Factom has a similar concept with the Redeem Condition Datastructure (RCD). Currently only one type is implemented (type 0x01), which is just "present a signature that checks out", like I described above. Theoretically Factom could implement other pay-to-script-like RCD types.

One thing that was not initially obvious to me about Factom addresses is that what we commonly call the public address (a human-readable address starting with FA...) is not the raw data of the cryptographic public key. It is actually the hash of the RCD containing that public key, with some prefix bytes prepended (so that it always encodes to a string starting with FA) and a checksum appended (the first four bytes of the double SHA-256 of the prefix + hash).
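
As a sketch, building the human-readable address from an RCD hash would look something like this (the 0x5fb1 prefix bytes are my assumption from the address spec, so double-check against the official docs):

```go
package fat

import (
	"crypto/sha256"

	"github.com/btcsuite/btcd/btcutil/base58"
)

// faAddress builds the human-readable FA... address from a 32-byte RCD hash.
// The 0x5fb1 prefix bytes and the double-SHA-256 checksum are my reading of
// the Factom address spec.
func faAddress(rcdHash [32]byte) string {
	body := append([]byte{0x5f, 0xb1}, rcdHash[:]...) // prefix makes it encode to "FA..."

	sum := sha256.Sum256(body)
	sum = sha256.Sum256(sum[:]) // checksum is the double SHA-256 of prefix + hash
	body = append(body, sum[:4]...)

	return base58.Encode(body)
}
```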

The raw data of the public key is actually the last 32 of the 33 bytes of the RCD; the first byte is the RCD type (0x01). So you can think of the RCD as a script whose opcode is 0x01 and whose arguments are a) the public key (part of the RCD), b) a signature (which immediately follows the RCD in an FCT tx), and c) the data that was signed (all bytes of the tx up to but not including the RCD section). For a tx to be valid, the RCDs must all hash to the corresponding input addresses, and the signatures must check out against their respective RCDs.
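
Putting that together, validating a single input would look roughly like this sketch. It encodes my assumptions (the input address is the double SHA-256 of the RCD, the RCD is one type byte plus a 32-byte ed25519 public key, and signedData is everything before the RCD section), so verify the details against the Factom datastructures doc:

```go
package fat

import (
	"bytes"
	"crypto/ed25519"
	"crypto/sha256"
	"errors"
)

// verifyInput checks one input of a Factoid-style tx against its type-0x01
// RCD and signature, per the description above.
func verifyInput(inputAddress [32]byte, rcd, sig, signedData []byte) error {
	if len(rcd) != 33 || rcd[0] != 0x01 {
		return errors.New("unsupported or malformed RCD")
	}

	// The RCD must hash to the input's address (assumed: double SHA-256).
	h := sha256.Sum256(rcd)
	h = sha256.Sum256(h[:])
	if !bytes.Equal(h[:], inputAddress[:]) {
		return errors.New("RCD does not hash to input address")
	}

	// The signature must check out against the public key inside the RCD,
	// over every byte of the tx before the RCD/signature section.
	if !ed25519.Verify(ed25519.PublicKey(rcd[1:]), signedData, sig) {
		return errors.New("invalid signature")
	}
	return nil
}
```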

Let me know if any of that doesn't make sense.

AdamSLevy commented 6 years ago

I now understand the field marshaledBinaryData. It wasn't immediately obvious to me that the JSON structure would be converted to the Factoid Transaction Binary data structure. That must be the marshaledBinaryData.

drkatz commented 6 years ago

@AdamSLevy Thanks for taking the time to write this, it helps my understanding a bunch.