ubiquity / pay.ubq.fi

Generate and claim spender permits (EIP-2612)
https://pay.ubq.fi

[Contributor Proposal] [Do not merge] Account Abstraction - (Nextjs, WebAuthn, Safe) #223

Open Keyrxng opened 2 months ago

Keyrxng commented 2 months ago

Related to #219

Changes



QA:

QA comment with video

github-actions[bot] commented 2 months ago
Preview Deployment
9b67c184a77be392d9732458402f7a7b5b73ebbb
40aadb256b40d32071aae3fa78028b619d123a49
Keyrxng commented 2 months ago

We'd also need to implement some way for users to control their funds. If a non-web3 user comes in without a wallet and we create one for them (before the cards are rolled out), they'd have no way to access the funds unless we provide them a UI, since we are now technically their wallet provider, or otherwise a way for them to securely swap and off-ramp.

molecula451 commented 2 months ago

The build isn't supposed to be failing, is it, @Keyrxng?

gentlementlegen commented 2 months ago

The build isn't supposed to be failing, is it, @Keyrxng?

I believe some environment variables are missing, according to the build:

Error: AUTH0_CLIENT_ID and AUTH0_DOMAIN must be set in the environment
Keyrxng commented 2 months ago

The build is failing because CI expects the build to come from the esbuild script, but it is now coming from next build, which isn't set up for CI. I should have written a clearer commit message: "build ready" should have been "next build ready". As I said, if this were given the green light, a bit of clean-up would be needed.

So my apologies if it wasn't clear before now, but this is more of a reference PR regarding account abstraction than an actual implementation PR. A few things need to be decided before an implementation PR, I think:

  1. Is a move to Next.js permitted? (If so, I assume that should have its own PR with minimal changes?)
  2. What type of smart account should be used: a minimally extensible lightweight account or a fully extensible modular account?
  3. Is the smart account being created at bot level (on GitHub) or as part of the pay.ubq.fi user flow?
  4. How would the user access/control their funds once claimed into their smart account: a pseudo-wallet UI or a direct off-ramp?

This recent EIP proposes baking AA into EOAs; it's quite a hot topic on CT right now. It's only in review (draft > review > last call > final), so it's pretty far out, but it should be kept in mind, I think.

0x4007 commented 2 months ago

It's interesting research. I'll be back on my computer early next week and hope to spend time on a deeper dive, because I am not well read on all the latest in account abstraction.

Looking at the video: I had a far more lightweight implementation in mind. The key point would be leveraging the webauthn browser API.

I think all these clicks, modals (and in this case, redirects with more inputs) that are prevalent in this industry are bad UX. This should all be handled behind the scenes, as seamlessly as possible.


Further UX optimizations:

molecula451 commented 2 months ago


  1. Is a move to Next.js permitted? (If so, I assume that should have its own PR with minimal changes?)
  2. What type of smart account should be used: a minimally extensible lightweight account or a fully extensible modular account?
  3. Is the smart account being created at bot level (on GitHub) or as part of the pay.ubq.fi user flow?
  4. How would the user access/control their funds once claimed into their smart account: a pseudo-wallet UI or a direct off-ramp?


Yes, great initiative and I'm in favor; sounds great. Some points:

  1. Yes; in fact we use Next.js in dollar, so it's feasible.
  2. I think we should leave this as a user decision, but both sound good to me.
  3. It's an obvious use case for all the web3 things in the repo, but it seems best to start with one instance first, say the pay.ubq.fi repo.
  4. A UI sounds better.
molecula451 commented 2 months ago

If you can add generic JS tests, that's a plus.

Keyrxng commented 2 months ago

Great stuff, I'll carry it through to completion over the weekend, taking these comments into account.

Keyrxng commented 2 months ago
How it works

The private key is deterministically generated

Implemented using information from cookie-based auth and the MFA (WebAuthn) credential.

Save key in localStorage so we don't need to login again

Not implemented. We could rebuild the account private key using the cookie auth and the browser-stored credentialPubKey and sign transactions that way, but only for AA accounts. That is already stored along with the wallet information, and keeping those details gated behind an authenticated user heightens security. Storing it by itself isn't a security risk, but the statement below from the docs is why I went with cookie-first auth. We could get away without it, but it has trade-offs.

In order to prevent such information leakage, the Relying Party could for example:

Perform a separate authentication step, such as username and password authentication or session cookie authentication, before initiating the WebAuthn authentication ceremony and exposing the user’s credential IDs.

allow user to automatically register their connected wallet with their GitHub username for future permit generation

Both newly visiting EOAs and smart accounts are populated into the users and wallets tables for future use.

seeding gas money by subtracting from their upcoming permit upon registration of wallet

I was using a tweaked version of the gas faucet to fund the few accounts that I saw through to deployment, but I've been thinking about this.

while the task is closed as complete; perhaps a new plugin

We could just use the faucet, seeing as that is already a worker: set up a KV namespace for it, track which newly created accounts (both SMAs and EOAs) have been funded, and expose access to the KV through calls to the worker so that other plugins/modules can use it.
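For illustration, a minimal sketch of what that worker-plus-KV tracking could look like, assuming a Cloudflare Worker with a KV binding named FUNDED_ACCOUNTS; the binding and route names are illustrative, not the faucet's actual API:

    // Hypothetical Cloudflare Worker sketch: track which newly created accounts (SMAs and EOAs)
    // the faucet has already funded. FUNDED_ACCOUNTS is an assumed KV namespace binding;
    // KVNamespace comes from @cloudflare/workers-types.
    export interface Env {
      FUNDED_ACCOUNTS: KVNamespace;
    }

    export default {
      async fetch(request: Request, env: Env): Promise<Response> {
        const url = new URL(request.url);
        const address = url.searchParams.get("address")?.toLowerCase();
        if (!address) return new Response("missing address", { status: 400 });

        if (url.pathname === "/is-funded") {
          // Other plugins/modules call the worker to check funding status.
          const fundedAt = await env.FUNDED_ACCOUNTS.get(address);
          return Response.json({ funded: fundedAt !== null, fundedAt });
        }

        if (url.pathname === "/mark-funded" && request.method === "POST") {
          // The faucet records the account once gas has been seeded.
          await env.FUNDED_ACCOUNTS.put(address, new Date().toISOString());
          return Response.json({ ok: true });
        }

        return new Response("not found", { status: 404 });
      },
    };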


this would have to occur when they register on GitHub

So will the user entry point for wallet and account creation always be GitHub via slash commands? Considering what @molecula451 said about it being user choice between the two smart account versions, how would they choose that at a bot level via slash command?

I think it makes sense either to choose only one and have a dedicated section in the docs or onboarding explaining things, or to create a new section of the claims portal. A visit to / without any claim permit could allow a user to create their new smart account, control their funds, and export their private key and mnemonic, so long as they are cookie-auth'd in order to create the account and then also auth'd with WebAuthn in order to move funds, view the private key, etc.

If only one is being chosen, it makes sense to use the Multi-Owner Modular Account because of its extensibility with plugins; it would likely work very well with whatever plans you have for Ubiquity cards. I assume they go hand-in-hand, with the long-term vision preferably being that everyone has at least an abstracted account, if not also the card. It would also make things easier to register at bot level, as we'd be able to set up, fund, debit and deploy all without leaving GitHub.


QA

https://github.com/ubiquity/pay.ubq.fi/assets/106303466/ace8cf09-5c8e-4827-a691-b6061b54c917


four_funded_and_deployed


https://github.com/ubiquity/pay.ubq.fi/assets/106303466/4d5f453d-a875-4875-82ae-92e32a6e8b83

0x4007 commented 2 months ago

This is nifty, but watching on my phone screen it's a bit hard to make out some of the details.

  1. Why do you have to webauthn multiple times?
  2. Why do you have to login with GitHub?
Keyrxng commented 2 months ago

Why do you have to webauthn multiple times?

The Request a Credential algorithm accepts a CredentialRequestOptions (options), and returns a Promise that resolves with a Credential if one can be unambiguously obtained, or with null if not.

I'm either grabbing or creating the account because we only have one entry point, /?claim=.... If they have an account, the first navigator.credentials.get() signs them in. If they don't have an account, then when we try to grab a credential it will return null (or they cancel), in which case a new account/credential set is created.

The second step (the third step in the video) of the new-creation process pulls info from the credential to use in creating the account, and just makes sure nothing went wrong in creating the credential, like a timeout of the OS request or some other cross-platform issue.

The createNew step is skipped if they already have an account, but if /?claim=... is the default entry, we can reduce the steps to two by first checking Supabase for their credential public key.
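For illustration, a minimal sketch of that get-or-create flow; buildRequestOptions() and createNewCredential() are assumed helpers, not the PR's actual functions:

    // Hypothetical sketch of the single-entry-point flow described above.
    declare function buildRequestOptions(): CredentialRequestOptions; // assumed: wraps the publicKey request options
    declare function createNewCredential(): Promise<Credential>;      // assumed: wraps navigator.credentials.create()

    async function getOrCreateCredential(): Promise<Credential> {
      try {
        // If the user already has a passkey for this domain, this signs them in.
        const existing = await navigator.credentials.get(buildRequestOptions());
        if (existing) return existing;
      } catch {
        // The OS request timed out or the user cancelled; fall through to creation.
      }
      // No credential could be unambiguously obtained: create a new account/credential set.
      return createNewCredential();
    }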


Suppose we stored a flag in localStorage to indicate they've created a passkey (their authenticator already knows they have, but we do not), had no Auth0/Supabase step, and somehow had user info for the credential creation. If they remove that flag and we create an account based on it, we'd create a new credential set in their authenticator, making it cumbersome for the user to identify and manage their separate credential sets/smart accounts. Otherwise we do a three-step check, and we could even add UI changes that prompt the user to confirm that they really want to create a new credential set/smart account, because as of right now we only offer one default account: any previous one would be overwritten, or the creation would fail.

With Auth0/Supabase we can be sure to always return the correct credentials, and it allows us to handle multiple credentials/smart accounts if that is a feature to be offered (does that fall in line with how the cards will work, maybe?).

My suggestion is that a visit to / could be the entry point to create a new account and credential set, while any visit to /?claim=... would only ever request credentials.


Why do you have to login with GitHub?

When creating the credentials we need to fill in identifying information from us as the host and them as the user and some other settings.

    publicKey: {
      challenge,
      user: {
        id: new Uint8Array(0), // placeholder, filled from the auth step
        name: "",
        displayName: "",
      },
      // ...remaining settings (rp, pubKeyCredParams, etc.)
    },
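
For illustration, here is how those fields might be filled from the cookie/Auth0 session before calling navigator.credentials.create(); the session shape and rp values are assumptions, not the PR's actual code:

    // Hypothetical: populate the credential creation options from the authenticated session.
    declare const session: { user: { id: string; login: string; name: string } }; // assumed Auth0/Supabase session shape

    async function createCredentialForUser(): Promise<Credential | null> {
      const options: CredentialCreationOptions = {
        publicKey: {
          challenge: crypto.getRandomValues(new Uint8Array(32)), // ideally generated server side (see later discussion)
          rp: { name: "pay.ubq.fi" },
          user: {
            id: new TextEncoder().encode(session.user.id), // stable, user-specific identifier from the auth step
            name: session.user.login,                      // e.g. the GitHub login from the auth step
            displayName: session.user.name,
          },
          pubKeyCredParams: [{ type: "public-key", alg: -7 }], // ES256
        },
      };
      return navigator.credentials.create(options);
    }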

As recommended, gating this behind Auth0 gives us multiple layers of security and more entropy for creating the new account, so long as the info we are using from that step is secured properly. I've tried to make it as secure as possible while still being deterministic.

Perform a separate authentication step, such as username and password authentication or session cookie authentication, before initiating the WebAuthn authentication ceremony and exposing the user’s credential IDs.

This statement from the docs is in regard to a user being identifiable from the easily accessed credential information via the credentialID (which is used as entropy for the pk), so a better way to handle it is to gate it behind an additional auth step.


But it would also make things easier to register at bot level, as we'd be able to set up, fund, debit and deploy all without leaving GitHub

I forgot to add that this implies they first set up their credential via /.

0x4007 commented 2 months ago

Cloudflare login uses my iCloud, so I figured there are some OS-level abstractions we can take advantage of for our identities instead of logging into GitHub as an extra step.

The Cloudflare UX prompts an OS modal with a single button to log in. When I press it, it does Face ID and I log in right away. It's really seamless and seems to have all the strengths with none of the weaknesses.

If for some reason that's not viable for non-Apple devices, an alternative suggestion is to generate a unique device fingerprint/signature (there are techniques that use WebGL) to deterministically generate a dedicated signing key per device.

Then we could in theory allow multiple claim addresses per user although that would require a forked version of permit2. But this would be great for private key security because the user doesn't ever need to export or save it elsewhere. The device should be able to deterministically generate it every time.

0x4007 commented 2 months ago

To clarify, I just tested the Cloudflare login again, and I'm specifically referring to the two-factor auth. This doesn't require an iCloud ID, but it's shared in my "keychain", which iCloud syncs across all devices.

Keyrxng commented 2 months ago

Cloudflare login uses my iCloud

Which is just another MFA auth step like GitHub login, but I hear what you are saying.

The device should be able to deterministically generate it every time.

The pk could be generated deterministically with only the information from the passkey, although you'd want to add some secure extra entropy not attached to the key. And we could remove any other auth step like GitHub or Auth0.

I avoided that because it seemed insecure to me at first, but I can push this ahead with just WebAuthn if that's acceptable, which by the sound of your comment it is, and enhancing pk generation/security can be hashed out later? Pun intended.
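
For illustration, a minimal sketch of that passkey-only derivation with an organizational salt as the extra entropy, using ethers v6 hashing utilities; the salt handling and names are assumptions, not an existing implementation:

    // Hypothetical: derive a deterministic private key from the passkey's raw credential id
    // plus an organization-held salt that is never exposed to unauthenticated clients.
    import { keccak256, concat, toUtf8Bytes } from "ethers";

    function deriveKeyFromPasskey(credentialRawId: ArrayBuffer, orgSalt: string): string {
      // keccak256(credentialId || orgSalt) yields a 32-byte hex string usable as a private key.
      return keccak256(concat([new Uint8Array(credentialRawId), toUtf8Bytes(orgSalt)]));
    }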

Keyrxng commented 2 months ago

Your keychain is part of your iCloud ID. I'm not sure how it works on Android so I can't comment, but it's my understanding that they go hand-in-hand.

iCloud Keychain can also keep the accounts you use in Mail, Contacts, Calendar, and Messages up to date across all your iPhone and iPad devices and Mac computers.

When you turn on iCloud Keychain on an additional device, your other devices using iCloud Keychain receive a notification requesting your approval of the additional device.

On one of your other devices, approve the additional device. Your iCloud Keychain automatically begins updating on the additional device.

To approve iCloud Keychain when you don’t have access to your other devices, follow the onscreen instructions to use your iCloud Security Code.


Maybe I misunderstand.

You are logging into Cloudflare using your keychain. I tried "Login with Apple"; is that the same as what you are seeing/doing?

image

Keyrxng commented 2 months ago

Yeah, I tried on my mobile too, same effect.

image

0x4007 commented 2 months ago

I wanted to include a video but I'm unsure how to censor the key it displays. To be honest I'm not even sure if that key is considered sensitive data.

image

I can also 2FA with my hardware wallets for other accounts. Basically there are APIs to inject keys into the browser with elegant OS level integrations that we should be taking advantage of.

I'm not so concerned with entropy because the raw inputs aren't "exportable" by the user. I don't see how it can be leaked if we don't provide that option in the first place.

We could consider browser fingerprinting so that it's all seamless, but I'm still not sure that this additional entropy is necessary.

However, that would be a really nice architecture: a user could automatically add their generated (browser-fingerprint-based) device keys to a Gnosis Safe where all their assets are. They can conveniently "2fa" with another device they own. Key rotation shouldn't be necessary if we don't save the key anywhere (it can be deterministically generated every time).

In this case perhaps it makes sense to add that additional entropy via a login (maybe GitHub or something better?) so that a malicious actor can't use our same algorithm to make a fake page, generate the key, and save it in their database.

(I understand now why you wanted to add additional entropy: it's so that their browsers can't be easily phished with a fake page?)

Another seamless way to handle this without a login: what if we use a POSIX timestamp as a salt and then save the generated key in localStorage? At this point, they would need a way to export the key, which is not desired.

However the benefit is that even with the same algorithm, a phisher won't be able to generate the same key.

Pros and cons...

I wish there was a way to not necessitate exporting/backups. One last idea is using a password as a salt, combined with a browser fingerprint. Autofill can make it seamless for the user to authenticate. That way the user just needs to remember the password and the device they used. We could encourage users to have a minimum of two keys connected to their claims so that if they lose their phone, etc., they can still manage their account from the other. We could also consider the GitHub user as a super user for their account so they can always manage their keys even if all their devices are unavailable.

0x4007 commented 2 months ago

We could black box the key generation algorithm by distributing it as a compiled wasm module.

Keyrxng commented 2 months ago

gnosis safe

Do you mean a literal Gnosis Safe, or are you using that interchangeably with an abstracted smart account? I see Gnosis have rolled out points for transacting with your own Safe, which is cool; maybe swap out Alchemy as the AA provider then and go with Safe?

it's so that their browsers can't be easily phished with a fake page

Sort of, the passkeys are tied to the domain that they get registered to so it's already sort of 'unphishable' in that sense.

But if the whole AA system is based off just the user-accessible passkey info, I believe all an attacker would need is to hack the user's iCloud account. Or, if a user were MITM'd, their info would be revealed after a successful passkey request; an attacker could likely use this plus the algorithm to reproduce the pk, or just access the website with the user's account logged in via a browser extension, for example (the same goes for GitHub: if the user has an iCloud passkey saved they're compromised anyway, so maybe that's redundant?). Will access to their card(s) use the same credentials/auth flow for moving funds, etc.?

It seems shaky, to me at least, to not include additional entropy somewhere that either only UBQ can produce or that we can tie to a user regardless of device, location, browser, etc. and confidently reproduce.

save the generated key in localStorage?

Personally I would not save my wallet private key anywhere in a browser, and I don't think it would be a huge selling point of a novel AA system, but I don't know. If you meant the passkey "key" (lol), then with added entropy it's not totally unsafe, I guess. But what happens when they get a new laptop or clear their localStorage? Bye-bye smart account, unless we always first attempt to retrieve a set of credentials and, if null is returned, create a new set (with "are you sure?" prompts, maybe), provided whatever entropy we are using can be securely reproduced.

password as a salt, combined with a browser fingerprint

I like the sound of this actually

two keys connected to their claims

For me the browser is the natural claiming point; I don't dev on my mobile, and mobile web3 sucks.

As part of the account creation path, it should be mandatory to create a passkey from your desktop browser and another from your mobile, then we are totally hands-off.

You can likely imagine the scenes: a huge devpool user base and user errors with an optional 2nd key.

So I'd suggest making it mandatory with disclaimers and warnings that the onus is on them after account creation regarding account access.

I'm not sure how the UBQ cards are coming together, but depending on how those cards are "owned", does it make sense to have AA be foundational to how the cards are constructed, or vice versa? I imagine the cards will be Fort Knox and their smart accounts should be too, but will the early account abstraction system being described here work well with the cards?

I wish there was a way to not necessitate exporting/backups.

So no one really knows the private key for the wallet; it's just produced, and only ever produced on Ubiquity domains via the wasm module? (Credentials can be shared across domains.)

That's exactly like being a custodial wallet provider like a CeX.

"not your keys, not your coins" springs to mind.

With both the cards and AA coming into play, it might be an idea to define exactly what sort of "wallet provider" Ubiquity is going to be, as that is certainly the direction it's all heading, with the need for UIs for handling funds, etc.

0x4007 commented 2 months ago

That's exactly like being a custodial wallet provider like a CeX.

"not your keys, not your coins" springs to mind.

That's not at all what I'm proposing. The keys are in the user's browser. They should never be transmitted off the device. We don't have any access to their keys.

The cards have almost nothing to do with blockchain right now. The current plan is to transfer ubiquity dollars to a "card minter contract" and then we mint a new card up to $1000 and give it to the user. It's our bank account. In order to reimburse it, we take the ubiquity dollars and redeem them for USD.

Sort of, the passkeys are tied to the domain that they get registered to so it's already sort of 'unphishable' in that sense.

This is great!

As part of the account creation path, it should be mandatory to create a passkey from your desktop browser and another from your mobile, then we are totally hands-off.

You can likely imagine the scenes: a huge devpool user base and user errors with an optional 2nd key.

The idea is that each key technically weakens the user's security. But if they keep adding keys to their account to be able to claim their rewards, then it's more convenient for them. I imagine the signing key can be for signing, but the funds should really be stored elsewhere.

So in this case maybe a modified permit2 contract makes sense for any user to be able to claim any permit. This means that we can use any signer to claim (and transfer) to their registered wallet on GitHub.

The thing is, when I personally receive rewards, I like to claim them instantly. I imagine others may also do the same because it ensures that you can't get rugged on your payment if it's already in your wallet.

Keyrxng commented 2 months ago

Started anew and took it in the new direction discussed.

This means that we can use any signer to claim (and transfer) to their registered wallet on GitHub.

I don't see why using the faucet to fund (with the KV upgrade) isn't a suitable approach?

Or a relayer set up to process claims from the portal domain, and only if reward.owner == ubiquibot, which would be best served as a worker with KV for tracking funding debits.

As soon as an account is registered (before any tasks are completed, because there's no address yet), the worker deploys the user's Safe with the PK algorithm kept inside the worker, saves the Safe address to the DB for permits, and stores the fee in KV. Permit generation, or a new plugin, fetches from KV and applies the debit before the final payment output. I think this is a much better approach because we could build another auth step into the worker.
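
For illustration, a rough sketch of that worker flow; every helper here (deriveUserPk, deploySafe, the db client) is an assumed placeholder rather than an existing API:

    // Hypothetical worker-side registration flow: deploy the user's Safe, record the address
    // for permit generation, and store the deployment fee in KV so it can be debited later.
    declare function deriveUserPk(userId: string): Promise<string>;                             // assumed: pk algorithm kept inside the worker
    declare function deploySafe(ownerPk: string): Promise<{ address: string; feeWei: bigint }>; // assumed Safe deployment helper
    declare const db: { saveWallet(userId: string, address: string): Promise<void> };           // assumed DB client (users/wallets tables)

    export async function onUserRegistered(userId: string, env: { FEES: KVNamespace }) {
      const pk = await deriveUserPk(userId);
      const { address, feeWei } = await deploySafe(pk);

      // Permits are generated against this Safe address from now on.
      await db.saveWallet(userId, address);

      // Permit generation (or a new plugin) reads this later and applies the debit
      // before the final payment output.
      await env.FEES.put(userId, feeWei.toString());
    }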


Safe is working on adding passkeys to existing Safes through their SDK, so I think it might be an idea to focus on a more secure pk algorithm and multi-passkey-per-account features once that's rolled out via the SDK?


Changes



QA:

For the sake of QA I manually funded a previously created passkey address, so I knew the address for funding, but I've left any sort of funding logic out of this new approach.

https://github.com/ubiquity/pay.ubq.fi/assets/106303466/2df79995-eed2-4a0a-808a-58af74abc142

rndquu commented 2 weeks ago

@Keyrxng This PR touches many (if not all) of the project files. So the changes are too big for a single PR.

We could divide it into issues like:

  1. Move pay.ubq.fi to Next.js (create an issue first, perhaps Next.js is not the best option but I'm ok with it)
  2. Research: select an account abstraction provider (already created at https://github.com/ubiquity/pay.ubq.fi/issues/219) so that contributors can claim without a web3 wallet in a gasless way
  3. Add account abstraction support
Keyrxng commented 2 weeks ago

@rndquu I agree with you; at least for the most part we know what's expected.

  1. I'm not against any framework and am open to suggestions other than Next; it's my preference, but I'm not dead set on it. Those who see this, please add your preferred framework.

  2. Gnosis for everything is the decision, isn't it: full integration with all Gnosis services and a WebAuthn-based AA model.

Would it be better to wait for #226 before migrating to a new framework? I know @EresDev proposed it in that PR; thoughts on that?

rndquu commented 3 days ago

@Keyrxng

TL;DR: As far as I understand, the AA feature depends on https://github.com/ubiquity/pay.ubq.fi/issues/174

I'm not against any framework and am open to suggestions other than Next; it's my preference, but I'm not dead set on it. Those who see this, please add your preferred framework.

I'm also ok with Next

Would it be better to wait for https://github.com/ubiquity/pay.ubq.fi/pull/226 before migrating to a new framework

Yes


Regarding the https://www.keyrxng.xyz/blog/let's-buidl-webauthn-account-abstraction article (which is awesome by the way).

Security considerations:

  1. The article describes generating the PK purely in the browser, while the WebAuthn standard requires the challenge parameter to be generated on the server side. I'm not sure the described approach is safe.
  2. The article describes generating the mnemonic seed in the browser this way:

    // the salt could be a user password, pin, etc...
    // (presumed imports: keccak256 via ethers, crypto_generichash/crypto_generichash_BYTES from libsodium,
    //  createHash from node:crypto, LocalAccountSigner from @alchemy/aa-core; generateMnemonic is
    //  presumably a local wrapper around ethers)
    async function generateEOA(userMetadata, passkey, sessionAuth, salt) {
      const { name, displayName } = userMetadata;
      const { id } = passkey;
      const { auth_account_created_at } = sessionAuth;

      // Concatenate the data into a buffer then hash it
      const concData = keccak256(
        Buffer.concat([Buffer.from(name), Buffer.from(displayName), Buffer.from(id), Buffer.from(auth_account_created_at), Buffer.from(salt, "hex")])
      );

      // hash the concatenated data which has already been hashed once
      const hash = crypto_generichash(crypto_generichash_BYTES, concData);

      // create a private key from the hash of the twice hashed data
      const privateKey = "0x" + createHash("sha256").update(hash).digest().toString("hex");

      // generate a mnemonic from the private key using ethers
      const mnemonic = generateMnemonic(privateKey);

      // create an account signer using alchemy from the mnemonic
      const accSigner = LocalAccountSigner.mnemonicToAccountSigner(mnemonic);

      // get the public key from the account signer
      const publicKey = await accSigner.getAddress();

      return { mnemonic, publicKey, privateKey };
    }

Most of the params which form the mnemonic's PK (i.e. the seed), namely name, displayName, and auth_account_created_at, can be either easily found out or brute-forced. The only unique param is passkey.id (besides the salt param, which is the topic of another discussion because it means we would have to pass the PK and mnemonic from the server to the browser, which breaks the rule "the PK is stored only on the user's device"). I'm not sure it is safe to use it this way, because it's not clear what stops other malicious JS code from calling navigator.credentials.get() and reconstructing the mnemonic's seed. This way any XSS becomes critical. Perhaps there exists a best practice for generating BIP-39 mnemonics using WebAuthn.


Regarding AA solutions.

I don't really like any of them since (at least nowadays) they are:

  1. Paid
  2. The AA topic is not standardized yet
  3. The AA field is changing too fast
  4. AUTH and AUTHCALL are not yet supported
  5. Too complex, thus error-prone

I would wait for some time for a free battle-tested AA solution and in the meantime stick to a simple approach with https://github.com/ubiquity/faucet.


Another point is that right now (with the current permit generation schema) we need to know the permit spender ahead of time (i.e. at permit generation), while with WebAuthn we can generate the user's PK only on claiming, which would force us to change the whole permit generation flow. My point is that WebAuthn (and AA to some extent) requires the feature of consolidated permits (where we store permits in a DB and generate them only on user demand, example flow).

So the flow could be:

  1. User solves their 1st GitHub issue worth 100 WXDAI
  2. User opens pay.ubq.fi
  3. User clicks "Generate permit"
  4. WebAuthn generates the user's PK (assuming the security notes mentioned earlier are not real issues)
  5. Server registers the user's address
  6. Server funds the user's address
  7. Permit is generated
  8. We sign the newly generated permit with the PK from WebAuthn and send the permit tx

This way we get "web3-free gasless permit claims".
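
For illustration, a rough sketch of steps 4-8 of that flow; derivePrivateKeyFromWebAuthn, registerAndFund and fetchGeneratedPermit are assumed placeholders, not the project's actual API:

    // Hypothetical end-to-end "web3-free gasless claim": derive the key, get registered and
    // funded by the server, then send the permit claim tx from the derived account.
    import { ethers } from "ethers";

    declare function derivePrivateKeyFromWebAuthn(): Promise<string>;                                 // step 4 (assumed)
    declare function registerAndFund(address: string): Promise<void>;                                 // steps 5-6, server side (assumed)
    declare function fetchGeneratedPermit(address: string): Promise<{ to: string; data: string }>;    // step 7 (assumed)

    async function claimWithoutWeb3Wallet(rpcUrl: string): Promise<void> {
      const pk = await derivePrivateKeyFromWebAuthn();
      const signer = new ethers.Wallet(pk, new ethers.JsonRpcProvider(rpcUrl));
      const address = await signer.getAddress();

      await registerAndFund(address);

      // Step 8: sign and send the claim tx with the WebAuthn-derived key.
      const permitTx = await fetchGeneratedPermit(address);
      const receipt = await (await signer.sendTransaction(permitTx)).wait();
      console.log("claimed in block", receipt?.blockNumber);
    }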

Keyrxng commented 2 days ago

@rndquu a lot of great points here ty.

WebAuthn standard requires the challenge parameter to be generated on the server side. I initially planned to use a Next.js/React stack and leverage the server accordingly, exposing only the credential mid-request from the client.

Our goal is a frictionless, no-signin, no-MFA, no-backend approach, which means circumventing some best practices. We’re exploring what’s possible.

I considered a worker that handles everything from pk alg to transaction handling, with strong authentication and encryption if the Next.js/React approach isn’t viable.


Most parameters forming a mnemonic's PK (name, displayName, auth_account_created_at) can be brute-forced. The docs explain that displayName and name are arbitrary and createdAt has limited entropy. I omitted some details in my blog due to uncertainty about sharing them.

Required user entity values (name, displayName, id) are arbitrary and could be defined by the user at registration and not their GH info. Supabase auth UUID and OAuth identity UUID are user-specific and hard to brute-force. GitHub login acts as an auth checkpoint, providing UUIDs to secure the pk further.

Salts add security assuming the user maintains security on their end:

An organizational salt integrated into the algorithm adds another layer of security. Proper CSP implementation and minimal browser storage can help prevent malicious JS from calling navigator.credentials.get().
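
For illustration, a minimal Next.js middleware showing the kind of CSP meant here; the directive set is an example, not the project's actual policy:

    // Hypothetical Next.js middleware (middleware.ts) applying a strict CSP so that only
    // same-origin scripts can run, mitigating injected JS calling navigator.credentials.get().
    import { NextResponse } from "next/server";

    export function middleware() {
      const response = NextResponse.next();
      response.headers.set(
        "Content-Security-Policy",
        "default-src 'self'; script-src 'self'; object-src 'none'; base-uri 'self'; frame-ancestors 'none'"
      );
      return response;
    }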


Balancing security and friction is key. Hacking this system requires compromising three high-entropy data points, keychain/biometric security and/or a successful MITM attack.

A Next.js-based server-side handling of pk and transactions exposes only the credential ID and wallet public key to the client.

User flow:

  1. User calls /start, bot says No wallet, first register at <url>
  2. User registers with full auth flow.
  3. User calls /start, gets permit, we fund with faucet, then full auth flow with GitHub-WebAuthn to claim permit.

For a hacker to claim the permit, they’d need to bypass all security measures and even then they'd only be able to claim it for the user.

We’ll use https://github.com/ubiquity/faucet for simplicity. Focus on securely building, transmitting, and using the PK, then fund the address with the faucet. Future enhancements can include smart accounts and sponsored transactions.


Multiple static UIs are good for iterating quickly, but the end product would be far better served as one full-stack app that encapsulates registering, claiming and moving funds. The credential is scoped only to the portal, we don't allow users to "export" it, and it's better UX to have one place that consolidates everything they need, as opposed to hopping between domains for certain actions.

I think:

Auth:

This consolidates the signer in one codebase, simplifying the pk module.

EDIT: Had GPT rewrite this to try to shorten it.

0x4007 commented 1 day ago

Multiple static UIs are good for quickly iterating but the end product would be far better served as one full-stack app that encapsulates registering, claiming and moving funds.

Agreed, so let's not jump the gun. Instead, I recommend we create multiple dedicated repos.