Closed maxsam4 closed 2 years ago
How exactly would the dependent_on field be passed in? Included as a field in the transaction itself?
Hey @IvanTheGreatDev, thanks for taking an interest. To answer your question: yes, dependent_on will be a field in the transaction itself, just like gasPrice and nonce are.
This could even replace the nonce's functionality (some of it, at least) if we force every transaction to include the hash of the account's last transaction in dependent_on. That would mean making dependent_on an array of tx hashes so that it can serve the nonce's purpose as well as its own.
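A rough sketch of what such a transaction payload might look like, assuming dependent_on is the proposed array-of-hashes field (all field names and values here are illustrative, not a spec):

```ts
// Illustrative only: `dependent_on` is the proposed (hypothetical) field,
// holding the hashes of transactions that must be mined before this one.
interface QueuedTransaction {
  to: string;
  value: string;          // wei, hex-encoded
  gasPrice: string;
  dependent_on: string[]; // tx hashes that must be mined first (or earlier in the same block)
}

const tx: QueuedTransaction = {
  to: "0x1111111111111111111111111111111111111111",
  value: "0x0",
  gasPrice: "0x3b9aca00", // 1 gwei
  // Including the account's own previous tx hash would cover the nonce-like ordering role:
  dependent_on: ["0xaaaa000000000000000000000000000000000000000000000000000000000000"],
};
```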
I like this idea, since the tx hash is something you get immediately after composing a transaction, so it can all be computed at once. I can see an interesting scenario where one transaction can have multiple follow-on transactions (let's call them a transaction spread, with different gas prices) all sharing the same dependent_on flag. Would this be allowed? And could this be executed all at once, potentially requiring the user (of a wallet) to approve all of those in one batch, without having to check and resubmit, since the spread can be crafted to reasonably guarantee timely execution (via a range of gas prices)?
Could you provide some examples, please? It's useful to step back and think about this as an end user.
There are many scenarios where you want a transaction to be mined strictly after another transaction [...] I will give specific examples when I properly restructure this. Just looking for opinions right now to decide if it's worthy of an EIP.
This is definitely an interesting idea and I'd love to see a fleshed out EIP 👍
I can see an interesting scenario where one transaction can have multiple follow-on transactions (let's call them a transaction spread, with different gas prices) all sharing the same dependent_on flag.
There's a lot of hidden complexity in moving from independent transactions to transaction DAGs, and I'm curious to see where the scope boundary gets drawn. Since we're essentially talking about batching, it would be good to see batching and chaining separately described, or at least a full semantics of how they're expected to interact (if at all).
For example, a description of what happens when one transaction in the same batch generation fails would be helpful, or whether there's a strict one-for-one chaining dependency (i.e., each transaction is isolated and the ordering is the only control mechanism). Thinking out loud, you may be able to specify that behaviour in a similar way to how Erlang declaratively sets supervision strategies (official docs here), but adjusted for this use case.
A quick and dirty example: { depends_on: 'abc', fail_strategy: 'one_for_all' }.
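To make the supervision-strategy analogy concrete, here is one way the failure policy could be expressed; fail_strategy and its values are hypothetical, taken from the quick-and-dirty example above:

```ts
// Hypothetical failure policies for a group of dependent transactions,
// loosely modelled on Erlang supervision strategies.
type FailStrategy =
  | "one_for_one"  // only the failed transaction is dropped; the rest still run
  | "one_for_all"; // if any transaction in the group fails, drop the whole group

interface DependentTx {
  depends_on: string;       // tx hash this transaction waits for
  fail_strategy: FailStrategy;
}

const example: DependentTx = { depends_on: "abc", fail_strategy: "one_for_all" };
```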
At risk of being out in left field but FWIW: we've been looking at an adjacent problem that your EIP may be a good generalization of: multi-transaction "sessions" (esp. pertaining to an ERC1066-to-web bridge). Our current model involves a server intermediary, and ideally output "pipes" like #1287 in addition to events. We hadn't considered doing the ordering directly by the network.
Often these transactions have not only an ordering dependency, but require some state threading and/or control logic (i.e., if return A then transaction B else transaction C). I suppose, based on the rough description, you'd set a flag on a control contract and the next transaction(s) would switch their behaviour based on that?
I like this idea, since the tx hash is something you get immediately after composing a transaction, so it can all be computed at once. I can see an interesting scenario where one transaction can have multiple follow-on transactions (let's call them a transaction spread, with different gas prices) all sharing the same dependent_on flag. Would this be allowed? And could this be executed all at once, potentially requiring the user (of a wallet) to approve all of those in one batch, without having to check and resubmit, since the spread can be crafted to reasonably guarantee timely execution (via a range of gas prices)?
Allowing multiple dependent_on entries, or having multiple transactions depend on a single transaction, shouldn't be a problem IMO. Batching transactions into a single transaction/signature request is a different thing though, and out of the context of this ERC. Also, I don't see how dependent_on will allow you to specify a range of gas prices. The highest gas price transaction will always get executed first, and if you make it dependent on the lower-price tx, then the lower-price tx will always get executed first. I'll soon write up detailed examples of where I want to use Sequentially Queued Transactions to improve UX and hence adoption.
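To illustrate that point about gas prices (all values hypothetical): the dependency, not the price, fixes the order.

```ts
// txB pays far more, but because it declares a dependency on txA it can never
// be executed before txA, no matter what a miner would prefer.
const txA = { hash: "0xaaa...", gasPrice: 1_000_000_000 };  // 1 gwei
const txB = {
  hash: "0xbbb...",
  gasPrice: 20_000_000_000,   // 20 gwei
  dependent_on: ["0xaaa..."], // may share txA's block, but must run after it
};
```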
There's a lot of hidden complexity in moving from independent transactions to transaction DAGs, and I'm curious to see where the scope boundary gets drawn. Since we're essentially talking about batching, it would be good to see batching and chaining separately described, or at least a full semantics of how they're expected to interact (if at all).
Implementing batching is a much more complex task than just implementing dependent_on. It was not my original intent to support batching, but it's something worth thinking about. The requirements I have are already solved by a simple queue.
Batching transactions into a single transaction/signature request is a different thing though, and out of the context of this ERC.
There's a lot of hidden complexity in moving from independent transactions to transaction DAGs, and I'm curious to see where the scope boundary gets drawn.
Agree with you both; it might be possible to give the impression of batching in the UI but still use discrete standalone transactions, so it shouldn't be in scope of this.
I don't see how dependent_on will allow you to specify a range of gas prices. The highest gas price transaction will always get executed first and if you make it dependent on the lower price tx then the lower price tx will always get executed first.
EIP #599 would then come in to solve my problem, by making it possible to create a tx with valid_from_block and valid_until_block. I can stagger these discrete transactions with, for example, a 10% increase in gas price each, and achieve the desired result. A wallet UI can then present and sign multiple transactions as one, or use some other clever display to show that you are about to sign mostly duplicates, but with a range of gas prices/blocks.
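A sketch of how a wallet could build such a spread (valid_from_block / valid_until_block are the EIP #599-style fields mentioned above; everything else is illustrative):

```ts
// Build three near-identical transactions with staggered validity windows and
// a roughly 10% higher gas price for each later window.
function buildSpread(base: { to: string; data: string }, startBlock: number) {
  return [0, 1, 2].map((i) => ({
    ...base,
    gasPrice: Math.round(2_000_000_000 * 1.1 ** i), // 2 gwei, 2.2 gwei, ~2.42 gwei
    valid_from_block: startBlock + i * 40,          // assumed field from EIP #599
    valid_until_block: startBlock + (i + 1) * 40,   // window closes as the next opens
  }));
}
```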
But back to the topic of dependent transactions; looking forward to hearing examples.
There are a few applications of dependent_on in my mind, but as I am currently on GovBlocks, I will start with the application of dependent_on in GovBlocks.
When a user onboards the GovBlocks platform, we deploy a set of contracts for them to manage the DAO. The contracts are deployed using a factory and proxy technique, so the user only needs to do a single transaction to onboard. However, we also have to add some default data to the contracts so that the user can use the DAO from the get-go. To ease UX, we add the default data from the backend, but the user is unable to use the DAO until those transactions are mined. If we use dependent_on, we can allow the user to use the GovBlocks platform and do transactions even before the transactions we sent are mined. We will just add our transaction's hash in the dependent_on of the user's transactions. This will also allow us to use a moderate amount of gas, rather than a ridiculous amount to have the transaction mined quickly so that the user can actually use the platform.
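A sketch of how that onboarding flow could look in practice (contract address, calldata, and the setup hash are placeholders):

```ts
// GovBlocks' backend sends the default-data transaction and hands its hash to
// the dApp; the user's first DAO transaction then declares a dependency on it,
// so it can be signed and broadcast immediately, before the setup tx is mined.
const setupTxHash = "0xsetup..."; // placeholder for the default-data tx hash

const userTx = {
  to: "0xDaoGovernanceContract",   // placeholder address
  data: "0x...",                   // e.g. createProposal(...) calldata
  gasPrice: 2_000_000_000,         // 2 gwei: no need to overpay for speed
  dependent_on: [setupTxHash],     // mined only after (or with) the setup tx
};
```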
There are more applications of dependent_on in GovBlocks itself. Once on the GovBlocks platform, users can create proposals, then submit solutions, and then cast votes. Using dependent_on, we can allow users to skip the wait for the previous transactions to be mined before doing the transactions for the next steps. This will greatly improve UX, especially when the users trust the original transaction sender (which is often the case).
Maybe we can add something that will render invalid all the transactions that are dependent_on the tx hash of a transaction that has been modified.
In summary, dependent_on can greatly improve user experience and can potentially lower the cost of an average transaction. I will keep adding more details as I get time and eventually make a draft EIP :).
+1 for this idea
There are also times when wallet software will queue up a bunch of transactions with sequential nonces. If the first transaction gets lost or has too low of a gas fee, then it won't be mined. The user will then have to return to the wallet software and replace that transaction. Only then will the next transactions be mined. This is AWFUL in almost every scenario.
If the ordering did NOT matter, then this prevents the non-dependent transactions from being mined.
If the ordering DID matter, then it's probable that you don't want the queued transactions to run if you are now cancelling the original transaction. This is even worse for financial applications where you might have been taking a trade that is no longer favorable. Given this scenario, the only recourse may be to permanently remove all of your Eth from the wallet with the replacement transaction. This also may be a problem if you have other ERC20 tokens still in that wallet.
Therefore I also propose an additional blocking field which can prevent other transactions from the same address from being mined. That way you can cancel pending transactions much like you can today using the same nonce. Either that, or you can keep the nonce but it no longer has to be in order and just blocks other transactions with the same nonce.
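If I read that correctly, a blocking cancel could look something like this; blocking is purely hypothetical and nothing like it exists today:

```ts
// Hypothetical `blocking` field: while this transaction is pending, no other
// transaction from the same sender may be mined, so it could serve as a
// "cancel everything I have queued" switch, similar to replacing a nonce today.
const cancelTx = {
  from: "0x00000000000000000000000000000000000000aa",
  to:   "0x00000000000000000000000000000000000000aa", // 0-value self-send
  value: 0,
  blocking: true, // assumption: holds back the sender's other pending transactions
};
```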
Thanks for your feedback @BrendanChou ! I couldn't make time to push this EIP/ERC further earlier but I'll try to get a formal PR done by Christmas. Meanwhile, I would like to tackle some of your concerns.
There are also times when wallet software will queue up a bunch of transactions with sequential nonces. If the first transaction gets lost or has too low of a gas fee, then it won't be mined. The user will then have to return to the wallet software and replace that transaction. Only then will the next transactions be mined. This is AWFUL in almost every scenario.
If the ordering did NOT matter, then this prevents the non-dependent transactions from being mined.
For transactions originating from a single account, we will have to maintain the nonce and keep transactions sequential. One of the main purposes of the nonce is to prevent replay attacks. Even if we find an alternative solution, it would be too big a change for too little gain IMO.
If the ordering DID matter, then it's probable that you don't want the queued transactions to run if you are now cancelling the original transaction. This is even worse for financial applications where you might have been taking a trade that is no longer favorable. Given this scenario, the only recourse may be to permanently remove all of your Eth from the wallet with the replacement transaction. This also may be a problem if you have other ERC20 tokens still in that wallet.
Even now, you can cancel all your pending transactions. You don't need to move all the ether from your account; you can just overwrite them with a tx sending 0 ether somewhere. Yeah, it will cost ~21000 gas per tx, but that's still cheap.
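For reference, a cancellation today looks roughly like this: reuse the pending transaction's nonce in a 0-value transfer with a higher gas price (the values below are just placeholders):

```ts
// Replace-by-nonce as it works today: a 0-value self-send with the same nonce
// as the stuck transaction and a higher gas price, so miners take this one instead.
const cancelPending = {
  from: "0x00000000000000000000000000000000000000aa",
  to:   "0x00000000000000000000000000000000000000aa", // send 0 ether to yourself
  value: 0,
  nonce: 42,                   // same nonce as the pending transaction being cancelled
  gasPrice: 5_000_000_000,     // 5 gwei: must out-bid the original to replace it
  gas: 21_000,                 // plain transfer cost
};
```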
Therefore I also propose an additional blocking field which can prevent other transactions from the same address from being mined. That way you can cancel pending transactions much like you can today using the same nonce. Either that, or you can keep the nonce but it no longer has to be in order and just blocks other transactions with the same nonce.
If this EIP is accepted, you will be able to mark the new transaction as dependent on the original transaction, so that if the original transaction is canceled, the new transactions will not be mined. I don't see the need for a separate blocking field.
I was thinking and writing about this over the holidays, but the more I wrote about the practical implementation and alternative solutions, the more I realized I was just contradicting my motivation. Now I am not so sure that implementing this is a good idea.
The practical implementation I thought of was: maintaining a mapping of tx hash -> stack of transactions dependent on it, and dividing the mempool into an active and a waiting mempool. This algorithm should work, but it increases the complexity on nodes. I don't have the numbers, so I'm unsure whether this is worth implementing.
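A minimal sketch of that bookkeeping, under the assumptions above (purely illustrative, not how any client actually structures its mempool):

```ts
// Transactions with unmet dependencies sit in a "waiting" pool, keyed by the
// hash they wait on; when that hash is mined, its dependents are promoted.
interface Tx {
  hash: string;
  dependent_on: string[];
}

class DependentMempool {
  private active = new Map<string, Tx>();    // ready to be mined
  private waiting = new Map<string, Tx[]>(); // dependency hash -> dependent txs

  constructor(private isMined: (hash: string) => boolean) {}

  add(tx: Tx): void {
    const unmet = tx.dependent_on.filter((h) => !this.isMined(h));
    if (unmet.length === 0) {
      this.active.set(tx.hash, tx);
    } else {
      // For simplicity, park the tx under its first unmet dependency.
      const key = unmet[0];
      this.waiting.set(key, [...(this.waiting.get(key) ?? []), tx]);
    }
  }

  // Called when a transaction is mined: promote everything waiting on it.
  onMined(hash: string): void {
    const dependents = this.waiting.get(hash) ?? [];
    this.waiting.delete(hash);
    dependents.forEach((tx) => this.add(tx));
  }
}
```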
Ethereum 2.0 is coming soon™, which will make alternative layer 2 solutions for this more feasible. It will probably also kill the motivation of lowering gas prices, since with the larger capacity gas prices should come down anyway. It might also make this harder to implement, as it might force nodes to maintain dependencies between inter-shard transactions.
As a result, I am pausing my work on this until someone proposes a better implementation.
There has been no activity on this issue for two months. It will be closed in a week if no further activity occurs. If you would like to move this EIP forward, please respond to any outstanding feedback or add a comment indicating that you have addressed all required feedback and are ready for a review.
This issue was closed due to inactivity. If you are still pursuing it, feel free to reopen it and respond to any feedback or request a review in a comment.
Currently, miners are free to choose which transactions to mine first. This leads to surprises when a transaction you expected to be mined first gets mined later. This often happens when the first transaction requires more gas and cannot fit in an already half-full block, but the second transaction can (with a lower gas price but a different tx.origin).
There are many scenarios where you want a transaction to be mined strictly after another transaction, but don't want to wait for the first transaction to be mined before queuing the next one. I will give specific examples when I properly restructure this. Just looking for opinions right now to decide if it's worthy of an EIP.
I suggest that we add a dependent_on field to transactions. This field will contain the tx hash of a transaction (or an array of tx hashes?) that MUST be mined before this transaction. Transactions containing this field should also be allowed to be included in the same block as the dependent_on transaction, given that the dependent_on transaction is executed first. This will allow devs to reliably force sequential execution of transactions originating from different accounts. It will also allow devs to use a lower gas price for non-critical transactions that they want to be mined in order. When combined with https://github.com/ethereum/EIPs/pull/599 , this will give even more control over queued transactions. This can also potentially solve https://github.com/ethereum/EIPs/issues/1441
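A sketch of the same-block inclusion rule described above (illustrative only, not spec language):

```ts
// A transaction with dependent_on may be included once every dependency is
// either already mined or executed earlier in the block being built.
interface Tx {
  hash: string;
  dependent_on: string[];
}

function mayInclude(tx: Tx, minedHashes: Set<string>, blockSoFar: Tx[]): boolean {
  const executedEarlier = new Set(blockSoFar.map((t) => t.hash));
  return tx.dependent_on.every((h) => minedHashes.has(h) || executedEarlier.has(h));
}
```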
EDIT: Adding a use case.
There are a few applications of dependent_on in my mind, but as I am currently on GovBlocks, I will start with the application of dependent_on in GovBlocks.
When a user onboards the GovBlocks platform, we deploy a set of contracts for them to manage the DAO. The contracts are deployed using a factory and proxy technique, so the user only needs to do a single transaction to onboard. However, we also have to add some default data to the contracts so that the user can use the DAO from the get-go. To ease UX, we add the default data from the backend, but the user is unable to use the DAO until those transactions are mined. If we use dependent_on, we can allow the user to use the GovBlocks platform and do transactions even before the transactions we sent are mined. We will just add our transaction's hash in the dependent_on of the user's transactions. This will also allow us to use a moderate amount of gas, rather than a ridiculous amount to have the transaction mined quickly so that the user can actually use the platform.
There are more applications of dependent_on in GovBlocks itself. Once on the GovBlocks platform, users can create proposals, then submit solutions, and then cast votes. Using dependent_on, we can allow users to skip the wait for the previous transactions to be mined before doing the transactions for the next steps. This will greatly improve UX, especially when the users trust the original transaction sender (which is often the case).
Maybe we can add something that will render invalid all the transactions that are dependent_on the tx hash of a transaction that has been modified.
In summary, dependent_on can greatly improve user experience and can potentially lower the cost of an average transaction. I will keep adding more details as I get time and eventually make a draft EIP :). A lower cost for the average transaction might not sound like a big deal, but in my opinion it is significant. How much this change can actually reduce the average cost is a different debate.