penumbra-zone / web

Apache License 2.0

Ignoring height_created field in NoteRecordSchema #83

Closed Valentine1898 closed 1 year ago

Valentine1898 commented 1 year ago

https://github.com/penumbra-zone/web/blob/fd9d82057f69d2d333b8dcd6982095733a768360/packages/types/src/state-commitment-tree.ts#L72C2-L72C2

I noticed that we simply ignore the height_created field. The problem is that it's not an optional field, which means we can't serialize the data we saved in indexed_db back to wasm (e.g. it would break transaction_plan creation). One more thing: if we just add a line to NoteRecordSchema

  heightCreated: z.bigint().optional()

Then it causes a validation error

service-worker.js:2 ZodError: [
  {
    "code": "invalid_type",
    "expected": "bigint",
    "received": "string",
    "path": [
      "new_notes",
      0,
      "heightCreated"
    ],
    "message": "Expected bigint, received string"
  },

Also, SpendableNoteRecord (NoteRecordSchema) is not a custom type for wasm; it's part of the protos and can be imported from buf as SpendableNoteRecord

P.S. I think we would also get a similar validation error for heightSpent if it were not always undefined
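The mismatch above comes from protobuf's canonical JSON encoding, which serializes 64-bit integers (uint64, int64) as decimal strings, so a field typed as bigint in the generated TypeScript arrives as a string after a JSON round trip. A minimal sketch of a normalizing helper (hypothetical, not part of the repo) that accepts either form:

```typescript
// Protobuf JSON encodes uint64 as a decimal string; the generated TS type
// says bigint. Accept both and normalize to bigint.
function heightFromJson(value: unknown): bigint {
  if (typeof value === "bigint") return value;
  if (typeof value === "string" && /^\d+$/.test(value)) return BigInt(value);
  throw new TypeError("expected uint64 as bigint or decimal string");
}

console.log(heightFromJson("120160")); // 120160n
console.log(heightFromJson(7n)); // 7n
```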

Valentine1898 commented 1 year ago

I think we should try to use types from buf wherever possible. This will allow for minimal code changes when the types in the penumbra repo change

grod220 commented 1 year ago

I think we should try to use types from buf wherever possible. This will allow for minimal code changes when the types in the penumbra repo change

Unfortunately, dealing with protos and types is still an ongoing pain point that isn't quite solved for this repo.

I also had some discussion on Discord when I first discovered what is, in my view, a large problem: https://discord.com/channels/824484045370818580/979785365437165658/1149434539123421335

In short, the TypeScript definitions cannot be trusted as generated. Some fields are marked required when they aren't, and some are marked optional when they aren't. This means we need to do our own validation.

Let's take the penumbra repo as an example. There are helpful serde attributes that handle both of these goals:

#[serde(try_from = "pb::SpendInfo", into = "pb::SpendInfo")]
pub struct SpendInfo {
    pub note_source: NoteSource,
    pub spend_height: u64,
}

Rust types are written manually with the actual correct required/optionals and used throughout the repo. However, it stays in compatibility sync with the protos due to the attribute.

On the web side, we are writing types using zod, which validates at runtime that we got what we expected. If not, it will throw in dev and log a console error in prod. But, as you allude to, it's unfortunate that we are not using the TypeScript types that are automatically generated from the protos. It would be nice if we had a kind of TypeScript decorator that did exactly what those Rust attributes are doing. However, in TypeScript the types live in a separate universe from the runtime code, and there isn't any tooling to help us do this.
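The runtime-validation idea can be sketched without zod as well: a hand-written type guard that checks the value at runtime and only then narrows the static type. This is a hypothetical, dependency-free illustration of the approach, not the repo's actual schema code.

```typescript
// A strict domain type with the *actual* required fields.
interface NoteRecord {
  heightCreated: bigint;
}

// Runtime check that narrows `unknown` to NoteRecord only when it passes.
function isNoteRecord(v: unknown): v is NoteRecord {
  return (
    typeof v === "object" &&
    v !== null &&
    typeof (v as { heightCreated?: unknown }).heightCreated === "bigint"
  );
}

console.log(isNoteRecord({ heightCreated: 10n })); // true
console.log(isNoteRecord({ heightCreated: "10" })); // false: string, not bigint
```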

Some additional ideas:

grod220 commented 1 year ago

We manually write try_from-style methods to match the zod types with the TypeScript proto types.

The more I think about it, I believe this may be the best way forward. This binds our types with the protobuf definitions. After a protobuf definition change, if it's breaking, it will break the build (which we want to happen).

I took some time to start this kind of work with the transaction types: https://github.com/penumbra-zone/web/pull/87. Because the extension serves as a grpc service, we need to do these conversions regardless. Putting them in package/types and having helper functions to do the conversions is likely the cleanest way to do this.
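The try_from idea can be sketched as a pair of converters between a strict domain type and the looser proto-JSON shape. The names below mirror the Rust SpendInfo example and are illustrative only, not the repo's actual API; uint64 is assumed to travel as a decimal string in JSON.

```typescript
// Loose shape as it comes off the wire: everything optional, u64 as string.
interface SpendInfoProtoJson {
  noteSource?: string;
  spendHeight?: string;
}

// Strict domain type with the actual required/optional semantics.
interface SpendInfo {
  noteSource: string;
  spendHeight: bigint;
}

// Analogue of Rust's TryFrom<pb::SpendInfo>: validate and convert.
function tryFromProto(pb: SpendInfoProtoJson): SpendInfo {
  if (pb.noteSource === undefined || pb.spendHeight === undefined) {
    throw new Error("missing required field");
  }
  return { noteSource: pb.noteSource, spendHeight: BigInt(pb.spendHeight) };
}

// Analogue of Rust's Into<pb::SpendInfo>: back to the wire shape.
function intoProto(d: SpendInfo): SpendInfoProtoJson {
  return { noteSource: d.noteSource, spendHeight: d.spendHeight.toString() };
}
```

If a proto field is renamed or removed, these converters stop compiling, which is exactly the "breaking change breaks the build" property described above.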

grod220 commented 1 year ago

After digging into the original issue, funnily enough, zod is right: https://github.com/penumbra-zone/web/issues/83. This makes me believe even more that this kind of validation is necessary. The protobuf TypeScript type says it's a bigint; however, it in fact serializes to a string. The error you are receiving is trying to inform you of this. I'm not sure whether this affects the wasm crate at all. It's always been this way; the difference is that now we know about it, whereas before we thought we were passing around a bigint and we weren't. Should we close this issue?

Valentine1898 commented 1 year ago

I tested an alternate generator of TS types from proto. It is provided as a buf plugin, supports the proto3 optional keyword, and has many customizations. It might make sense for us to use it.

Valentine1898 commented 1 year ago

In the example above, I used the forceLong=string option and added the optional keyword to the height_spent field in view.proto:

  optional uint64 height_spent = 6;
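
Under those settings the generated output might look roughly like the sketch below (illustrative only, not actual generated code; ts-proto's forceLong=string maps 64-bit integers to string, and the proto3 optional keyword makes the field optional):

```typescript
// Hypothetical shape of the generated type with forceLong=string
// and `optional uint64 height_spent`:
interface SpendableNoteRecordGenerated {
  heightCreated: string; // uint64 -> string under forceLong=string
  heightSpent?: string; // optional uint64 -> optional string
}

const rec: SpendableNoteRecordGenerated = { heightCreated: "120160" };
console.log(rec.heightSpent === undefined); // true
```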

grod220 commented 1 year ago

@Valentine1898 can you push that to a branch and share it here?

hdevalence commented 1 year ago

I'm still not totally clear on what the problem is with using the proto-derived types? I think it would be great if we could avoid rewriting all of the domain types in Typescript if possible. It's already a big pain point in Rust and it would be nice not to multiply our pain.

Valentine1898 commented 1 year ago

Pushed here https://github.com/Valentine1898/penumbra/commit/30eac866336274ab37a2972128b8f5be9ff7d188 To generate, simply run buf generate

grod220 commented 1 year ago

=== Update ===

After pairing with folks internally, we discovered that the TypeScript type issues were a consequence of the JSON serialization that occurs when calling wasm functions. Because so much is downstream of scanBlock(), a large portion of the codebase relied on these JSON objects (which exhibited all of the painful traits I mentioned above) and required re-writing types to rectify. The solution is to parse the results from the wasm crate back into the proto types using the fromJson method. This also handles validation, and zod usage can be scaled back to only those types not defined in the protos.
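The fromJson pattern can be sketched in plain TypeScript as follows. The class below is a stand-in for a generated proto class (protobuf-es generated messages expose a fromJson method of this kind); the names and shapes are illustrative, not the actual generated code.

```typescript
// Loose JSON shape produced by the wasm crate: uint64 travels as a string.
interface SpendableNoteRecordJson {
  heightCreated?: string;
  heightSpent?: string;
}

// Stand-in for a generated proto class: fromJson both validates the JSON
// and normalizes field types, so no separate zod schema is needed.
class SpendableNoteRecordLike {
  constructor(
    readonly heightCreated: bigint,
    readonly heightSpent?: bigint,
  ) {}

  static fromJson(json: SpendableNoteRecordJson): SpendableNoteRecordLike {
    if (json.heightCreated === undefined) {
      throw new Error("heightCreated is required");
    }
    return new SpendableNoteRecordLike(
      BigInt(json.heightCreated),
      json.heightSpent !== undefined ? BigInt(json.heightSpent) : undefined,
    );
  }
}

const rec2 = SpendableNoteRecordLike.fromJson({ heightCreated: "5" });
console.log(rec2.heightCreated); // 5n
```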

Tomorrow I'll be working on implementing these fixes 👍