regen-network / regen-ledger

:seedling: Blockchain for planetary regeneration
https://docs.regen.network

Extend Credit Batch functionality to allow dynamic minting #924

Closed robert-zaremba closed 2 years ago

robert-zaremba commented 2 years ago

Summary

Add the ability to mint new credits after a batch has been created.

Problem Definition

Currently, Batch issuance and initial allocation are defined during Batch creation, and it is not possible to create new credits after that. If we applied this model to a bridge use case, it would create additional fragmentation and degrade the user experience. Consider the following scenario. Alice transfers 1 X-TCO2 token (which represents a specific project and vintage; X is an ERC-20 smart contract representing that pair) to Regen Ledger. An hour later, happy to see that everything worked, she transfers 100,000 tokens from the same X-TCO2. Two hours later, Bob transfers 50,000 of the same tokens to Regen Ledger. While all these tokens were fungible on Polygon, the current Regen model would create three batches, and the resulting credits would not be fungible across batches.

Proposal

  1. Create an rpc BatchMint(MsgBatchMint) returns (MsgBatchMintResponse) method. Only a Credit Class issuer can mint new credits. In practice, authz will be used to authorize another account (a bridge account) to mint. The authorization will have a credit amount limit.

    message MsgBatchMint {
      // address authorized to issue new credits in a given batch, signer of the msg
      string issuer = 1;
      // issuance are the credits issued in the batch.
      repeated BatchIssuance issuance = 2;
      // reference to the transaction originating the mint process.
      OriginTx origin_tx = 3;
      // reference note for accounting, will be passed to an event
      string note = 4;

      message OriginTx {
        // type of the transaction originating the mint process, e.g. Polygon, Ethereum, Verra
        string type = 1;
        // the id of a transaction based on the type (tx id, serial number)
        string id = 2;
      }
    }
  2. Add a new attribute to Batch (and MsgCreateBatch):

    // maximum amount of credits that can be minted. Optional; when empty there is no
    // maximal supply. When the supply (tradable_amount + retired_amount + escrowed_amount)
    // equals max_credits, no more credits can be minted.
    // Zero is not a valid value.
    string max_credits;

    When a batch is locked, no new credits can be minted (this should be the default).
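
As a sanity check on the proposed semantics, here is a minimal sketch in Go of the supply check at mint time. All type and field names are illustrative stand-ins for the proposal above, not the actual regen-ledger schema; a real implementation would use the module's own decimal type rather than math/big.

    package main

    import (
        "errors"
        "fmt"
        "math/big"
    )

    // batch is an illustrative stand-in for the on-chain batch record; field
    // names follow the proposal above, not the actual regen-ledger schema.
    type batch struct {
        TradableAmount *big.Rat
        RetiredAmount  *big.Rat
        EscrowedAmount *big.Rat
        MaxCredits     *big.Rat // nil means no maximal supply
    }

    // checkMaxCredits rejects a mint that would push the total supply
    // (tradable + retired + escrowed) above max_credits.
    func checkMaxCredits(b batch, mintAmount *big.Rat) error {
        if b.MaxCredits == nil {
            return nil // no maximal supply configured
        }
        supply := new(big.Rat).Add(b.TradableAmount, b.RetiredAmount)
        supply.Add(supply, b.EscrowedAmount)
        if new(big.Rat).Add(supply, mintAmount).Cmp(b.MaxCredits) > 0 {
            return errors.New("mint would exceed max_credits")
        }
        return nil
    }

    func main() {
        b := batch{
            TradableAmount: big.NewRat(300, 1),
            RetiredAmount:  big.NewRat(100, 1),
            EscrowedAmount: big.NewRat(0, 1),
            MaxCredits:     big.NewRat(500, 1),
        }
        fmt.Println(checkMaxCredits(b, big.NewRat(100, 1))) // <nil>
        fmt.Println(checkMaxCredits(b, big.NewRat(200, 1))) // error
    }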



clevinson commented 2 years ago

I think at a high level we first need to ask ourselves if fungibility of credits within a credit vintage is a desired feature.

For context, this was part of the initial goal of the ecocredits design, but that use case did not initially require batches to allow dynamic minting: with native credits we assumed that a BatchCreate() would only happen at the conclusion of a monitoring period, so all credits for a given vintage could be issued at once.

So in light of a use case where we will be "bridging credits" from either off-chain registries like Verra or on-chain systems like Polygon, we need to decide whether we: 1) preserve batch immutability, or 2) preserve batch <> vintage parity.

1) Preserve Batch Immutability

In this approach, we potentially lose the ability to have double-counting prevention at the project & batch level. We've discussed previously that prevention of double counting will likely be handled off-chain (via proof submission / slashing mechanisms), but in addition to that I've imagined we could also have some double-counting prevention directly on chain that enforces basic restrictions on whether a project can have multiple batches of a given credit class (or credit type) issued to it with overlapping date ranges (batch_start_date and batch_end_date).

If we want to preserve batch immutability while also preserving this ability for basic double-counting prevention, then we should likely include a parameter at the credit class level that allows batches to have overlapping time periods for a given project. This way, native credit classes can still be configured to prevent batches from having overlapping date boundaries for a given project.
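
For concreteness, the overlap restriction described above reduces to a standard interval-intersection test. A minimal sketch in Go (an illustrative helper, not part of the regen-ledger codebase):

    package main

    import (
        "fmt"
        "time"
    )

    // datesOverlap reports whether two closed date ranges intersect. A credit
    // class configured to prevent overlap would reject a new batch for a
    // project when this returns true against any of its existing batches.
    func datesOverlap(aStart, aEnd, bStart, bEnd time.Time) bool {
        return !aEnd.Before(bStart) && !bEnd.Before(aStart)
    }

    func main() {
        d := func(s string) time.Time {
            t, _ := time.Parse("2006-01-02", s)
            return t
        }
        // Overlapping vintages: rejected.
        fmt.Println(datesOverlap(d("2015-01-01"), d("2015-12-31"), d("2015-06-01"), d("2016-05-31"))) // true
        // Adjacent, non-overlapping vintages: allowed.
        fmt.Println(datesOverlap(d("2015-01-01"), d("2015-12-31"), d("2016-01-01"), d("2016-12-31"))) // false
    }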

If fungibility at this vintage level were still a goal, we could investigate leveraging the basket sub-module for enabling fungibility of ecocredits at the vintage level, but I see this as a less elegant solution since it requires exchanging basket tokens back for the underlying ecocredit batch asset when a user wants to retire a credit.

This approach probably still requires on-chain upgrades to be made, as I think it's important that we have some on-chain prevention of multiple batches corresponding to the same serial number / txHash (from a bridge operation).

2) Preserve Batch <> Vintage Parity

I think @robert-zaremba outlines most of the benefits of this approach in the issue description above. To summarize my thinking on this:

Pros:

Cons:

Native Credits Use Case

One thing that we haven't discussed yet, but I'd love @S4mmyb's input on -- is whether there is a desired use case for native credits to be able to dynamically mint credits to a batch after it has been issued. If there's a clear use case for this for native credits, then that probably makes a stronger argument for us supporting this direction. @S4mmyb can you give some context as to whether you see this as a use case for native credits as well?

If we do imagine a situation where native credits want to be minted to a pre-existing vintage that has already been created via a BatchCreate() method, will the issuer always know the maximum issuance for that vintage up front? Or could later recalculations in methodology cause the amount of credits for that vintage to change in ways that could not be predicted when the vintage was first created?

robert-zaremba commented 2 years ago

Re the cons: the new set of metadata is included in my proto proposal above. We can ensure on chain that there is no double minting, or ensure it only on the bridge agent side.
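
To make the on-chain guarantee concrete, here is a minimal sketch of deduplicating mints by their OriginTx (type, id) pair, using an in-memory map as an illustrative stand-in for an on-chain index:

    package main

    import (
        "errors"
        "fmt"
    )

    // originTxIndex records OriginTx (type, id) pairs that have already been
    // minted against, so the same source transaction cannot mint twice.
    type originTxIndex map[string]bool

    func (idx originTxIndex) claim(txType, txID string) error {
        key := txType + "/" + txID
        if idx[key] {
            return errors.New("origin tx already used for minting: " + key)
        }
        idx[key] = true
        return nil
    }

    func main() {
        idx := originTxIndex{}
        fmt.Println(idx.claim("Polygon", "0xabc123")) // <nil>
        fmt.Println(idx.claim("Polygon", "0xabc123")) // error: double mint attempt
    }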

clevinson commented 2 years ago

Another point that @aaronc shared with me today is making sure we're thinking about how we handle fungibility when a vintage may have different source bridges. Do these also correspond to fungible assets (e.g. a fungible batch)?

To illustrate, let's imagine:

I think the trust assumptions here are orthogonal to whether we lump these into the same batch, but I'm curious to hear others' thoughts.

robert-zaremba commented 2 years ago

In fact the part about "trust assumptions" is the most important one.

  1. If we (via the gov process) trust the sources, i.e. that the tokens are indeed originally registered by Verra AND that the way the tokens are bridged is trustless or legitimate, then the credits can share one batch.
  2. Otherwise we should probably create 2 different batches to eliminate the double-spend risk.

In other words: we need a mapping between the source contract and the destination batch. For Toucan it can be automated. For other sources, we need to establish a process to ensure the right trust level and the expected outcome.

S4mmyb commented 2 years ago

Thanks for the deep dive into this @robert-zaremba @clevinson. A few thoughts here...

Native Credit Use Case


Speaking to @clevinson's question first, I don't foresee dynamic minting as a necessary feature for any native credits. Native credits will, and should, be issued via immutable batches that correspond to a monitoring round with a batch_start_date and batch_end_date.

The only similar use case I can think of would be ex-ante credit issuance, i.e. issuing credits for expected ecological change such as future carbon sequestration. Ex-ante crediting is popular with projects that want up-front financing for implementation; however, the functionality for this scenario does not correspond to the ex-post credits we have here in the bridge scenario. In an ex-ante scenario, credits should be issued and sold in an unretirable, untradable state until the time of monitoring, at which point the amount can be adjusted and the credits become "unlocked". With that in mind, I don't think our decision here should be influenced by native credit use cases.


Challenges with the max_credits approach

Overall, I think @robert-zaremba's approach would work well, and I agree with @clevinson's pro-con list. However, I do wonder how well the max_credits approach will scale as bridging of legacy credits grows in popularity. To illustrate, I'll build on @clevinson's previous example.

Let's say Verra has 500 credits available for a 2015 vintage for the Mai Ndombe project.

In the scenario above, only 100 of the credits from the 2015 vintage have actually been retired, but there are still 400 in existence. So what's the max_amount? While we could take Toucan's approach and say the max_amount should be 500, corresponding to the total number of credits originally issued in the 2015 Mai Ndombe vintage, the maximum amount that could be issued on Regen shouldn't be 500, it should be 400. And if Alice retires her credits on Polygon, or the 100 credits left on the Verra registry get retired by Doug, the amount would need to be adjusted down to 300. While we could take measures to ensure no double counting happens, I think it would be challenging to set a max_amount that properly traces where all these credits live and which have been permanently retired, so as to properly prevent double minting.

To me, the strongest long-term approach is a minimal-trust system in which we don't need to rely on trusting the sources, such as the "Admin's responsibility to delegate issuing authority to bridges, and users trusting the admin's delegation of trust", to ensure proper migration of credits.


Argument for Using Baskets

> If fungibility at this vintage level were still a goal, we could investigate leveraging the basket sub-module for enabling fungibility of ecocredits at the vintage level, but I see this as a less elegant solution since it requires exchanging basket tokens back for the underlying ecocredit batch asset when a user wants to retire a credit.

I agree with @clevinson's point that this approach has its challenges, but I wonder if we can turn it into a more elegant solution by using interfaces to mask the complexity. In other words, even though the basket approach to trading and retiring credits will require multiple operations to ensure vintage-level fungibility, can we design our interfaces to make it seem like these actions happen in a single operation?

Regardless of what we do, we will still need to establish a process to ensure the expected outcome as different bridging solutions emerge. However, this solution might require a bit less trust, because instead of relying on the Admin to delegate authority to bridges, we just need to rely on the credit issuers checking that all requirements outlined in the bridging process have been followed, i.e. that the tokens are indeed originally registered by Verra AND the way the tokens are bridged is trustless or legitimate.

I haven't fully thought through this approach and I think there could be challenges with it as well, but I think it's worth considering as an alternative if we want to investigate it.

marceljay commented 2 years ago

@clevinson, you wrote:

* Does this correspond with 300 Verra credits on the "2015 Mai Ndombe batch"?

  * This implies that it's the credit class Admin's responsibility to delegate issuing authority to bridges (be they Polygon bridges or Verra bridges), and users trusting the admin's delegation of trust.

Which is an interesting thought for the whole ecosystem. At Toucan Protocol, we would face the same question if another Carbon-Tokenizer-X wanted to convert their tokenized VCU credits into TCO2s.

Most important is to ensure integrity within your governed system. So no matter whether these respective 100 and 200 credits belong to different batches/classes or to the same batch/class, the integrity of the system is at the same risk, because either way the credits would be eligible for the same basket.

What would change from a user's perspective if the trust for new issuance was delegated not from within the class but from another class inside the system, yet credits could still be sold via the same basket? Assuming the case of an external bridge compromise, I don't see any meaningful difference in mitigation outcomes, as the illegitimately issued credits would likely already have been deposited to the pool.

On the other hand, if an external tokenization system (e.g. Toucan) were no longer considered legitimate, the token bridge can be deactivated to prevent further issuance. This again would not significantly change the system's integrity as a whole ("no new illegitimate credits inside the NCT basket"), versus allowing continuous bridging of illegitimate credits to a different credit class.

clevinson commented 2 years ago

@marceljay One point to clarify: credits are not necessarily always eligible for the same basket. It is true that in the current design the only capability for basket filter criteria is an allowed list of credit classes, but this is just our MVP implementation; the fuller specification of the baskets module does include the ability for basket acceptance criteria to be more granular. In particular, one could imagine a filter criterion for a basket such as "Verra credits issued by address_alice or address_bob", where address_alice represents an EOA (externally owned account) or, in the future, an on-chain account such as a DAO or module/smart contract address. Since basket criteria can be updated, the governance of the basket could decide to remove a bridge issuer from the list of allowed issuers whose credits of that credit class the basket will accept.
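
To illustrate the kind of granularity meant here, a minimal sketch of an issuer-aware acceptance check. The criteria shape is hypothetical; the actual baskets module specification may differ:

    package main

    import "fmt"

    // basketCriteria is a hypothetical acceptance filter: an allow-list of
    // credit classes plus an optional allow-list of issuer addresses.
    type basketCriteria struct {
        allowedClasses []string
        allowedIssuers []string // empty means any issuer of an allowed class
    }

    func (c basketCriteria) accepts(classID, issuer string) bool {
        if !contains(c.allowedClasses, classID) {
            return false
        }
        return len(c.allowedIssuers) == 0 || contains(c.allowedIssuers, issuer)
    }

    func contains(xs []string, x string) bool {
        for _, v := range xs {
            if v == x {
                return true
            }
        }
        return false
    }

    func main() {
        // "Verra credits issued by address_alice or address_bob"
        c := basketCriteria{
            allowedClasses: []string{"C01"},
            allowedIssuers: []string{"address_alice", "address_bob"},
        }
        fmt.Println(c.accepts("C01", "address_alice"))   // true
        fmt.Println(c.accepts("C01", "address_mallory")) // false
    }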

clevinson commented 2 years ago

We spoke about this in a bit more detail on our Regen Ledger Dev+Product Sync today, where we go through open architecture and technical pieces on a regular basis.

While we didn't come to a 100% aligned decision, I think there was progress made and some clarity on a few important points. Here's an outline of where my current thinking is:

Fungibility of Batches

While there are a lot of ways to think about what "batches" correspond to, it's important to remember that the original purpose of batches was to represent the smallest level of "non-fungibility", or "traceability", in ecocredits. By design, ecocredits are always non-fungible by default across different batches, and fully fungible within a given batch.

Ecocredits from different bridges

We discussed in a bit more detail what our expectations are when there are multiple ecocredits coming from different bridges but corresponding to the same project & monitoring period (i.e. vintage).

Let's again imagine:

What level of traceability is desired here? From our conversations on the product call today, it was clear that we don't want all of Alice's and Bob's credits smushed together. This is because if one of the bridges is later deemed faulty, Carole should be able to tell which of her assets were issued from the Polygon bridge, and which ones from the Verra <> Regen bridge.

However, there wasn't a clear need for Carole's credits to separately track the credits that Alice bridged on Tuesday as different than the ones she bridged on Wednesday.

So when looking at Carole's ecocredit balance, the following is sufficient traceability:

{
  "C01-20150101-20151231-025": {
    "tradable_credits": 250
  },
  "C01-20150101-20151231-231": {
    "tradable_credits": 100
  }
}

Where C01-20150101-20151231-025 is the batchId for Mai Ndombe 2015 credits bridged from Polygon (and issued by the Polygon bridge issuer address), and C01-20150101-20151231-231 is the batchId for Mai Ndombe 2015 credits bridged from Verra (and issued by the Verra <> Regen bridge issuer address).
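
As an aside, the batch IDs above follow the pattern classId-startDate-endDate-sequence; a throwaway Go helper for composing them (illustrative only, not the canonical denom logic):

    package main

    import (
        "fmt"
        "time"
    )

    // batchID composes a batch identifier in the pattern seen above:
    // <classId>-<YYYYMMDD start>-<YYYYMMDD end>-<zero-padded sequence>.
    func batchID(classID string, start, end time.Time, seq int) string {
        return fmt.Sprintf("%s-%s-%s-%03d",
            classID, start.Format("20060102"), end.Format("20060102"), seq)
    }

    func main() {
        start := time.Date(2015, 1, 1, 0, 0, 0, 0, time.UTC)
        end := time.Date(2015, 12, 31, 0, 0, 0, 0, time.UTC)
        fmt.Println(batchID("C01", start, end, 25)) // C01-20150101-20151231-025
    }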

Without a strong argument for preserving traceability down to an individual bridge action (e.g. always treating Alice's Tuesday & Wednesday credits as being tracked in different ecocredit batches), my personal preference is for us to allow batches to have dynamic minting, but only allow the issuer of the initial batch to mint more credits to that batch.

Cardinality of Batches

Another way to think about tradeoffs for this is to look at how the cardinality of batches may change in these different approaches. Let's imagine a situation where we have 10 Polygon bridge actions per day over 6 months, and these bridge actions are distributed over 100 different Verra project vintages.

If we go with the updatable batch per issuer approach, we'd have 100 (project vintages) * 1 (bridge issuer) = 100 batches.

If we go with immutable batches (a new batch per bridge action), then we'd have 10 (bridge actions per day) * ~180 (days in 6 months) ≈ 1,800 batches.

max_credits vs sealed boolean

From @S4mmyb's feedback, it seems like setting a limit of "max_credits" doesn't do much good in actually preventing false credit issuances, as there could always be credits previously retired on Verra that can never be bridged.

It seems then that a boolean configuration (probably at both the batch & credit class level) allowing dynamic minting within batches, which can later be flipped to seal a batch, is more favorable than a max_credits field.
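
A minimal sketch of what such a gate could look like at mint time, assuming illustrative names (a batch-level "open" flag and a class-level "allow dynamic minting" flag, neither of which is settled API):

    package main

    import "fmt"

    // canMint sketches the boolean gate discussed above: the credit class must
    // allow dynamic minting, the batch must not be sealed, and only the
    // original batch issuer may mint. All names are illustrative.
    func canMint(classAllowsDynamicMinting, batchOpen bool, batchIssuer, signer string) bool {
        return classAllowsDynamicMinting && batchOpen && signer == batchIssuer
    }

    func main() {
        fmt.Println(canMint(true, true, "regen1bridge", "regen1bridge"))  // true
        fmt.Println(canMint(true, false, "regen1bridge", "regen1bridge")) // false: batch sealed
    }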

Next Steps

@marceljay Our internal team will be meeting again to go over these points tomorrow during our ledger team standup. If we end up going with the above approach, we'll turn our attention to refining #937 and aligning on the protobuf API. We welcome your input there as well!

robert-zaremba commented 2 years ago

@marceljay I think your points relate more to the bridge itself than to the batch functionality. In any case, whether we allow dynamic issuance into an existing batch or the creation of a new batch, there is an accounting problem if the source system is compromised, since we want to directly enable these tokens in baskets and IBC.

robert-zaremba commented 2 years ago

Thinking more about this, I would prefer a system which is simple, solid, and user-friendly.

It seems that the argument against dynamic issuance is that we want to distinguish the origin and the registration event for every batch, thus baking two abstractions into a single object: a vehicle moving assets to Regen, and a vintage. Toucan has two different contracts for these: Batch (a vehicle) and TCO2 (which represents the vintage).

If we think that one type is enough, then I think we can solve the registration (vehicle) part using events, which have their own abstraction on chain anyway.

Note about authorization

We can use x/authz, a mechanism in the Cosmos SDK for authorizing any account to execute specific messages (transactions) on behalf of another account. E.g., a credit class could be controlled by a multisig, while batch issuance is delegated to a bridge account using x/authz. A sketch follows.
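
A hedged sketch of such a grant against the Cosmos SDK v0.45-era x/authz API (later versions take a *time.Time expiration). The MsgBatchMint type URL is hypothetical, since the message is only proposed in this issue, and the credit-amount-limited grant mentioned above would need a custom Authorization implementation rather than GenericAuthorization:

    package main

    import (
        "fmt"
        "time"

        sdk "github.com/cosmos/cosmos-sdk/types"
        "github.com/cosmos/cosmos-sdk/x/authz"
    )

    func main() {
        // Placeholder 20-byte addresses; in practice these would be the credit
        // class issuer (e.g. a multisig) and the bridge account.
        granter := sdk.AccAddress([]byte("credit_class_issuer_"))
        grantee := sdk.AccAddress([]byte("bridge_account______"))

        // GenericAuthorization lets the grantee execute one message type on the
        // granter's behalf. The type URL below is hypothetical.
        auth := authz.NewGenericAuthorization("/regen.ecocredit.v1.MsgBatchMint")

        // Grant valid for one year from now.
        msg, err := authz.NewMsgGrant(granter, grantee, auth, time.Now().AddDate(1, 0, 0))
        if err != nil {
            panic(err)
        }
        fmt.Println(msg.Granter, msg.Grantee)
    }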

clevinson commented 2 years ago

@robert-zaremba I'm not sure I follow why you're referring again to arguments against dynamic issuance. In my latest comment, I said that I was in favor of a limited version of dynamic minting:

> Without a strong argument for preserving traceability down to an individual bridge action (e.g. always treating Alice's Tuesday & Wednesday credits as being tracked in different ecocredit batches), my personal preference is for us to allow batches to have dynamic minting, but only allow the issuer of the initial batch to mint more credits to that batch.

Do you feel aligned with this approach? Let's decide on this high-level architecture first before talking about whether or not events can solve some part of the puzzle.

aaronc commented 2 years ago

@clevinson and I discussed dynamic minting yesterday and I'm in favor of allowing it. Here's the approach that feels most sensible:

clevinson commented 2 years ago

This generally sounds good to me. What's the purpose of enable_future_minting at the MsgExtendBatch level? Did you mean having it be a field in MsgCreateClass and MsgCreateBatch instead of MsgCreateBatch and MsgExtendBatch?

One note is that we discussed having this field (enable_future_minting) be something that can later be set to false to "Seal" the credit batch, rendering it immutable at some later point.

I'm not actually sure if this is necessary, as if we're restricting dynamic minting to only the "bridge" use case, then I cannot imagine a scenario where we would actually want to seal a batch. Most batches will remain open indefinitely unless there is a point where we decommission a bridge, but at that point we would more likely remove the bridge address from the issuer list entirely.

robert-zaremba commented 2 years ago

> why you're referring again to arguments against dynamic issuance.

Probably I was not clear enough. I'm fully in support of dynamic minting, especially the limited version (my initial design had a lock attribute which would apply to batches created "natively"). In https://github.com/regen-network/regen-ledger/issues/924#issuecomment-1082959725 I wanted to give more arguments against complexifying the design, and to say that if needed we should handle it in a better way (like mirroring the model used by Toucan, where batch and vintage are 2 different things).

robert-zaremba commented 2 years ago

I've updated the PR. Let's continue the discussion there.

@aaronc:

Let's continue the discussion in the PR: #937