polkadot-js / extension

Simple browser extension for managing Polkadot and Substrate network accounts in a browser. Allows the signing of extrinsics using these accounts. Also provides a simple interface for compliant extensions for dapps.
Apache License 2.0

Unable to use the `ChargeAssetTxPayment` signed-extension on Westend Asset Hub #1314

Closed · josepot closed this issue 3 months ago

josepot commented 4 months ago

As I'm sure you already know: on Polkadot-SDK runtimes that support the ChargeAssetTxPayment signed-extension, it is possible to pay transaction fees using an asset that can be swapped for the chain's native currency.

In some runtimes the ChargeAssetTxPayment signed-extension is configured to use a u32 to indicate a "local" asset_id present on that particular chain; this is currently the case on "Polkadot Asset Hub".

However, the ChargeAssetTxPayment signed-extension could also be configured to use (for instance) an XcmV3MultiLocation path to indicate the asset. This is currently the case on "Westend Asset Hub", and, unless I'm mistaken, it will also be the case for "Polkadot Asset Hub" at some point in the not-so-distant future.

The issue is that the PJS extension crashes on dApps that try to leverage this signed-extension when the runtime is configured to use an asset_id type other than Option<u32>:

[screenshot: the extension crashing]

This happens even when the extension has been updated with the latest metadata. Also, the "Back to Home" button does nothing, and the only way to get the extension working again is to disable and re-enable it.

Therefore, it seems that the PJS extension assumes that the asset_id field of the ChargeAssetTxPayment signed-extension will always be an Option<u32>, which is not a safe assumption. The extension should, instead, check in the metadata what the actual shape of this field is, and present and sign the data it receives from the dApp according to the type defined in the metadata.
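To make the shape mismatch concrete, here is a minimal, self-contained sketch. The encoder is hand-rolled plain TypeScript for illustration only (it is not polkadot-js code): SCALE encodes Option<u32> as a one-byte Some/None tag followed by a fixed-width little-endian u32, while a MultiLocation-shaped asset_id encodes to a different, variable-length byte sequence.

```typescript
// Hand-rolled SCALE encoding of Option<u32>, for illustration only.
// SCALE: None -> 0x00; Some(x) -> 0x01 followed by the u32 in little-endian.
function encodeOptionU32(value: number | null): Uint8Array {
  if (value === null) return Uint8Array.of(0x00);
  const out = new Uint8Array(5);
  out[0] = 0x01; // Some
  new DataView(out.buffer).setUint32(1, value, true); // little-endian u32
  return out;
}

const toHex = (bytes: Uint8Array): string =>
  Array.from(bytes).map((b) => b.toString(16).padStart(2, "0")).join("");

// A "local" u32 asset id, as on Polkadot Asset Hub today:
console.log(toHex(encodeOptionU32(1984))); // "01c0070000"
console.log(toHex(encodeOptionU32(null))); // "00"

// A MultiLocation-shaped asset_id (e.g. { parents: 0, interior: { X2: [...] } })
// encodes to a longer, structurally different byte sequence, so a decoder that
// blindly assumes Option<u32> will misread it or crash, as reported above.
```

The point is that the bytes alone do not say which shape they carry; only the asset_id type recorded in the chain's metadata can select the correct decoder.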

FWIW that's what Polkadot-API does. Meaning that when a developer is trying to leverage this signed-extension on "Westend Asset Hub" the types look like this, and the types of that same field on "Polkadot Asset Hub" are a plain number, while on chains that don't support the ChargeAssetTxPayment the types won't even allow the developer to pass the asset field.

I came across this issue thanks to @jameshih, who asked me to "translate" to Polkadot-API the following POC dApp that he had built using PJS. What this POC dApp does is:

> Transacting with JOE TEST TOKENS and paying transaction fees in JOE TEST TOKENS, no WND required!

I did translate it to Polkadot-API and also added a few new goodies (like selecting the account provider and automatically updating the balances, among others). I have deployed the translated PAPI version here. So, as long as you have an account with JOE test tokens on Asset Hub Westend, you should be able to easily replicate the problem.

As of right now, Talisman is the only extension that doesn't crash and is able to correctly create the transactions on Westend Asset Hub. However, it's currently not displaying to the user the asset_id field that they are about to sign :/.

TarikGul commented 4 months ago

We sorted this out in the polkadot-js/api via 5752 which was released in 10.11.2.

So the most recent version of the extension 0.46.6 contains polkadot-js/api 10.11.1 which is behind. We need to do a release this week, and it should be resolved. I will also do a sanity check tomorrow to make sure my assumptions here are right.

josepot commented 4 months ago

> We sorted this out in the polkadot-js/api via 5752 which was released in 10.11.2.
>
> So the most recent version of the extension 0.46.6 contains polkadot-js/api 10.11.1 which is behind. We need to do a release this week, and it should be resolved. I will also do a sanity check tomorrow to make sure my assumptions here are right.

This is highly problematic.

It should accept the scale encoded value instead, because the way that PJS decodes scale-encoded data is an implementation detail of the PJS library, and this API should be a generic signer.

TarikGul commented 4 months ago

> It should accept the scale encoded value instead, because the way that PJS decodes scale-encoded data is an implementation detail of the PJS library, and this API should be a generic signer.

It does accept it though, type AnyNumber allows it to be inputted as a string or Uint8Array. object is just there for users who want to input the direct MultiLocation as an object.

josepot commented 4 months ago

> It should accept the scale encoded value instead, because the way that PJS decodes scale-encoded data is an implementation detail of the PJS library, and this API should be a generic signer.
>
> It does accept it though, type AnyNumber allows it to be inputted as a string or Uint8Array. object is just there for users who want to input the direct MultiLocation as an object.

But how will it know whether that encoded value is meant to be a u32 or a MultiLocation? Is it actually checking that against the metadata? šŸ¤”

I mean, I know that the types accept binary data... The types also accept binary data for many other fields (specVersion, nonce, era, tip, etc). However, if you try to pass the properly scale-encoded data for those fields, then things blow up... Ask me how I know that... šŸ˜‰

TarikGul commented 4 months ago

> But how will it know whether that encoded value is meant to be a u32 or a MultiLocation? Is it actually checking that against the metadata? šŸ¤”

AFAIK it knows what the type should be based on specVersion, and specName. It's mapped out in the override types. For example: https://github.com/polkadot-js/api/blob/master/packages/types-known/src/spec/westmint.ts#L50-L61

So in this example: when the specVersion for westend-asset-hub is greater than 9434, it will resolve to TAssetConversion: 'Option<MultiLocation>'.
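For readers unfamiliar with the override mechanism, the lookup can be sketched like this. The shapes are loosely modelled on the types-known file linked above, and the 9434 boundary is taken from the comment; everything else (names, the exact boundary handling) is illustrative, not the real implementation:

```typescript
// Illustrative model of a per-chain type-override table keyed by spec version.
interface OverrideEntry {
  // entry applies to spec versions in [min, max); max undefined = open-ended
  min: number;
  max?: number;
  types: Record<string, string>;
}

// Hypothetical entries for Westend Asset Hub ("greater than 9434" per the
// comment above, i.e. 9435 and up resolve to the MultiLocation form):
const westmintOverrides: OverrideEntry[] = [
  { min: 0, max: 9435, types: { TAssetConversion: "Option<u32>" } },
  { min: 9435, types: { TAssetConversion: "Option<MultiLocation>" } },
];

function resolveOverride(specVersion: number, typeName: string): string | undefined {
  const entry = westmintOverrides.find(
    (e) => specVersion >= e.min && (e.max === undefined || specVersion < e.max)
  );
  return entry?.types[typeName];
}

console.log(resolveOverride(9430, "TAssetConversion")); // "Option<u32>"
console.log(resolveOverride(9435, "TAssetConversion")); // "Option<MultiLocation>"
```

The weakness discussed later in the thread follows directly from this shape: the table only knows about chains and version boundaries someone has written down, so a new chain or an unrecorded spec bump silently resolves to the wrong type.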

TarikGul commented 4 months ago

> Ergo, it won't work if I pass a Uint8Array or a hexadecimal value, because it's not actually checking how it should be decoded against the actual metadata.

I'll need to check this a little deeper as well. Not completely up to date with the implementation in the extension.

Are you using the store version or building from master?

josepot commented 4 months ago

Does this work for you with the version that's about to be released? šŸ™

TarikGul commented 4 months ago

> Does this work for you with the version that's about to be released? šŸ™

Says "Team access required", can't get in.

josepot commented 4 months ago

Apologies! It did work with the build of the latest master. I mean, it's not displaying the assetId information to the user, which IMO is quite important, but at least the tx was successfully created:

[screenshot: the transaction successfully created]

That being said, I really think that these kinds of custom definitions are very problematic for many different reasons. The most obvious one is that since metadata v14 it has been possible to take that information directly from the metadata. Also, will the extension work if someone sets up their own parachain with this signed-extension? šŸ¤”

TarikGul commented 4 months ago

> It did work with the build of the latest master

Yay! Thanks for letting me know! I had a feeling it should work (with master) :) That being said, I see a few issues here that could be resolved:

  1. Print the AssetId information.
  2. Release a new version of the extension.
  3. Resolve the store version being many versions behind the GitHub releases. It's currently on 0.44.1. Not sure why it's so behind, or whether we even have access to it, but I will look into that.

> That being said, I really think that these kinds of custom definitions are very problematic for many different reasons. The most obvious one is the fact that since metadata v14 it is possible to take that information directly from the metadata

I understand the sentiment and agree that it's not ideal. I think it was solving a historical problem before things became more standardized; it was quite useful before metadata v14. (Kind of a side note: for documenting historical types across runtimes it's amazing.)

> Also, will the extension work if someone sets up their own parachain with this signed-extension?

I need to look into this more. I know how it would work for the API itself when creating a type registry, but in terms of the extension I don't have a concrete answer, and need to work through the extension implementation more.

josepot commented 4 months ago

I put together a write-up sharing my thoughts on this topic. If there is one crucial point I would like to make, it is this: the extension should stop relying on custom types ASAP, otherwise these issues will keep popping up and it will be a never-ending game of whack-a-mole.

In short: any data that the user is signing should have been decoded/interpreted using the type definitions of the metadata (and not just the method). Therefore, the extension should stop relying on "custom definitions" ASAP.

Custom definitions were needed prior to metadata v14, as @jsdw explains in this post:

> PolkadotJS was created in 2017 as a client which was capable of interacting with Polkadot (and later, its parachains). It required type information to know how to encode/decode things, but none was available, so it had to construct its own. It built a mapping from the name of a type to some description of how to encode/decode it (nowadays this is mostly here). Since the shapes of many types evolved over time, PolkadotJS would add overrides to its type information that would take effect in certain spec versions and on certain chains in order to continue to be able to understand their shapes. Thus, if PolkadotJS knew which chain and spec version you were targeting, it would be able to look up how to decode information for it.

> Newer libraries like Subxt and Polkadot-API were able to leverage the type information in modern metadata and so have never evolved this ability, meaning that PolkadotJS remains the only way to decode historic information today.

Therefore, we should all be grateful to Jaco for having created this system, no doubt! Thanks to it, we now have the means for decoding old blocks, which is awesome!

However, if there is one place where we should urgently stop relying on custom definitions, it is the PolkadotJS extension, because the only way to truly reflect what the user is about to sign is to match it against the definitions present in the metadata. Not to mention that these custom definitions don't scale, are redundant, and are very difficult to maintain.
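The metadata-driven alternative argued for here can be sketched as follows. The structures below are a deliberately simplified stand-in for metadata v14 (where each signed extension declares a type id resolvable through the metadata's type lookup); they are not the extension's actual code:

```typescript
// Simplified stand-in for metadata v14 structures.
interface SignedExtensionMeta {
  identifier: string;
  typeId: number; // index into the type lookup below
}
interface MetadataLite {
  signedExtensions: SignedExtensionMeta[];
  lookup: Map<number, string>; // type id -> rendered type name (simplified)
}

// Resolve the asset_id shape from the metadata itself, with no
// hand-maintained per-chain override table:
function chargeAssetTxPaymentType(meta: MetadataLite): string | undefined {
  const ext = meta.signedExtensions.find(
    (e) => e.identifier === "ChargeAssetTxPayment"
  );
  return ext === undefined ? undefined : meta.lookup.get(ext.typeId);
}

// Example: a chain whose metadata declares a MultiLocation-based asset_id.
const meta: MetadataLite = {
  signedExtensions: [
    { identifier: "CheckNonce", typeId: 0 },
    { identifier: "ChargeAssetTxPayment", typeId: 1 },
  ],
  lookup: new Map([
    [0, "Compact<u32>"],
    [1, "Option<XcmV3MultiLocation>"],
  ]),
};
console.log(chargeAssetTxPaymentType(meta)); // "Option<XcmV3MultiLocation>"
```

Because the answer comes from the chain's own metadata, this approach works unchanged for any parachain that includes the signed-extension, with no per-chain table to maintain.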

jsdw commented 4 months ago

> However, if there is one place where we should urgently stop relying on custom definitions, it is the PolkadotJS extension, because the only way to truly reflect what the user is about to sign is to match it against the definitions present in the metadata. Not to mention that these custom definitions don't scale, are redundant, and are very difficult to maintain.

I definitely agree that using proper type information to encode stuff for the signer would be ideal!

The only counterpoint I'd raise is that if it proves difficult to do, then personally I'm OK with the hacky workaround (i.e. the type overrides) to get us by for now, given that the goal here is to just keep the lights on with PJS until users can be migrated to alternatives.

That said, I think we should have a go at doing it "properly", because it'll be more robust in the long term if we do, and the type-overrides approach may lead to more future work/issues as other chains start using, or update their usage of, ChargeAssetTxPayment.

josepot commented 4 months ago

> The only counterpoint I'd raise is that if it proves difficult to do...

It is not. We already have the tools for doing that; this week we were going to put together basic docs for it.

Also, I wouldn't mind putting together a demo for this.

> given that the goal here is to just keep the lights on with PJS until users can be migrated to alternatives.

The problem is that we are in a bit of a chicken-and-egg situation right now. What we certainly don't want is to tell all the users who have installed PJS-based extensions that they should throw those extensions away. We must find a way forward. So, as much as I agree with this mindset, the extensions are a bit of a different beast. We need to find a path forward that improves them without breaking the PJS dApps currently using them, while at the same time enabling better alternatives.

polkadot-js-bot commented 3 months ago

This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue if you think you have a related problem or query.