zenstackhq / zenstack

Fullstack TypeScript toolkit that enhances Prisma ORM with a flexible authorization layer for RBAC/ABAC/PBAC/ReBAC, offering auto-generated type-safe APIs and frontend hooks.
https://zenstack.dev
MIT License

@@validate() fails validation when update payload doesn't have property. #1078

Closed nwidynski closed 7 months ago

nwidynski commented 7 months ago

Description and expected behavior

When implementing model-level validation and running UPDATE actions, ZenStack fails validation for properties that are not present in the update payload. Since a property isn't mutated when it's not passed, its absence should not invalidate the action.

Screenshots

model Counter {
  id String @id

  name String
  value Int

  @@validate(value >= 0)
}

import { PrismaClient } from '@prisma/client';
import { enhance } from '@zenstackhq/runtime';

// @@validate rules are enforced through the enhanced client
const client = enhance(new PrismaClient());

const counter = await client.counter.create({
  data: { id: "1", name: "It should create", value: 1 }
})

//! This query fails validation
const updated = await client.counter.update({
  where: { id: "1" },
  data: { name: "It should update" } 
})

Environment (please complete the following information):

ymc9 commented 7 months ago

Hi @nwidynski, thanks for filing this! I'll look into it and make a fix soon if it's confirmed to be a bug.

ymc9 commented 7 months ago

Hi @nwidynski, I've pushed a new v1.10.1 release which should handle non-existing fields properly during validation. Could you give it a try? Thanks!

CollinKempkes commented 7 months ago

@ymc9 it really got better, thanks! Only one problem remains when using other operators, such as in. For example:

model ValidationTest extends BaseEntity, PublicBase {
  id Int @id @default(autoincrement())
  type String
  name String

  @@validate(type in [
      'fee',
      'tax'
    ],
    'type must be one of the following values: fee, tax'
  )
}
describe('Update validation test', () => {
    it('should resolve', async () => {
      const enhancedPrisma = enhance(new PrismaClient());

      const validationTest = await enhancedPrisma.validationTest.create({
        data: {
          type: 'application_fee',
          name: 'validation name',
        },
      });

      await enhancedPrisma.validationTest.update({
        where: {
          id: validationTest.id,
        },
        data: {
          name: 'validation name updated',
        },
      });
    });
  });

Here the validationTest.update call is failing with: "Error calling enhanced Prisma method update: denied by policy: validationTest entities failed 'update' check, input failed validation: Validation error: type must be one of the following values: fee, tax"

ymc9 commented 7 months ago

Oh, sorry, I missed testing the in operator. There's some intricacy about it. I've made one more fix with a bunch more test cases. Also, the CLI checks have been tightened up for cases that are not supposed to be supported. Please see if the new 1.10.3 build works. Thanks!

CollinKempkes commented 7 months ago

Thanks for the update @ymc9, it really looks even better than before! But now we are experiencing a very confusing error with @@validate, and I don't know how it is connected to any change.

Basically, the usage of deleted_at does not work anymore:

model ValidationTest {
  id Int @id @default(autoincrement())
  type String
  name String?

  deleted_at DateTime? @db.Timestamp(6)
  deleted_at2 DateTime? @db.Timestamp(6)

  @@validate(type in [
      'fee',
      'tax'
    ],
    'type must be one of the following values: fee, tax'
  )

  @@allow('all', true)
  @@map('validation_test')
  @@schema("public")
}

When using this model with this test suite:

  describe('Update validation test', () => {
    it.only('should resolve', async () => {
      const enhancedPrisma = enhance(testContext.prismaClient);

      const validationTest = await enhancedPrisma.validationTest.create({
        data: {
          type: 'fee',
          name: 'validation name',
        },
      });

      await enhancedPrisma.validationTest.updateMany({
        where: {
          id: validationTest.id,
        },
        data: {
          deleted_at: new Date(),
        },
      });
    });
  });

We are receiving this error: "Error calling enhanced Prisma method updateMany: denied by policy: validationTest entities failed 'postUpdate' check, entity { id: 26 } failed policy check"

The strange thing is, when we execute this test:

  describe('Update validation test', () => {
    it.only('should resolve', async () => {
      const enhancedPrisma = enhance(testContext.prismaClient);

      const validationTest = await enhancedPrisma.validationTest.create({
        data: {
          type: 'fee',
          name: 'validation name',
        },
      });

      await enhancedPrisma.validationTest.updateMany({
        where: {
          id: validationTest.id,
        },
        data: {
          deleted_at2: new Date(),
        },
      });
    });
  });

it works. It looks more like a general issue, but I found that when I remove the validation part of the model:

model ValidationTest {
  id Int @id @default(autoincrement())
  type String
  name String?

  deleted_at DateTime? @db.Timestamp(6)
  deleted_at2 DateTime? @db.Timestamp(6)

  @@allow('all', true)
  @@map('validation_test')
  @@schema("public")
}

Then this test will also work:

  describe('Update validation test', () => {
    it.only('should resolve', async () => {
      const enhancedPrisma = enhance(testContext.prismaClient);

      const validationTest = await enhancedPrisma.validationTest.create({
        data: {
          type: 'fee',
          name: 'validation name',
        },
      });

      await enhancedPrisma.validationTest.updateMany({
        where: {
          id: validationTest.id,
        },
        data: {
          deleted_at: new Date(),
        },
      });
    });
  });

It does not make any sense, as deleted_at does not seem to be a reserved keyword in either Prisma or ZenStack. It is also strange that it only fails in combination with @@validate. I hope this is enough information to work with; otherwise just tell me and I'll try to give you a better example.

CollinKempkes commented 7 months ago

Okay, sorry, it seems like it could be connected to our Prisma extensions. I will investigate this further.

CollinKempkes commented 7 months ago

So we checked this further, and the problem seems to be a findFirst call that gets added through the use of @@validate. We are doing a manual kind of soft-delete check with Prisma: basically, we override the findFirst (and other) methods so that soft-deleted entities are not found anymore.

But after the soft-deletion call, the entity can't be found by that read-back, as it is already marked deleted. We could change this to the "ZenStack way" by using @@deny('read', deleted_at != null). But we also use the non-enhanced Prisma client in some places, and there we would lose the ability to do a normal find (and many other methods) and get/update non-deleted values without adding the filter to every query or writing too many custom queries.
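For reference, a rough sketch of that policy-based approach (a trimmed-down ValidationTest model, other fields and attributes omitted) would look something like this:

model ValidationTest {
  id         Int       @id @default(autoincrement())
  type       String
  name       String?
  deleted_at DateTime? @db.Timestamp(6)

  // hide soft-deleted rows from reads
  @@deny('read', deleted_at != null)

  @@allow('all', true)
}

But as described above, this rule only takes effect through the enhanced client.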

Therefore our questions:

  1. Wouldn't it suffice for @@validate to only check the given input params instead of doing a findFirst after applying updates? I guess there is good reasoning for why it is done this way, but we still wanted to ask. It also seems the errors could be thrown much earlier, which could save some time: if the params cannot be validated, the error could be raised without executing the queries at all.
  2. Is there any possibility to pass some kind of argument/prefix into Prisma queries so we can see that they come from ZenStack? With this option we could bypass our check for deleted_at, and maybe in other places too. For example, is it possible to override the Prisma transaction id manually and give it a prefix? When looking into the arguments of findFirst we can see __internalParams, and inside this attribute there is a transaction of this form:
    {
      "kind": "itx",
      "id": "cltn0zdrd00013r0ekdelheoz"
    }

Just for better understanding, this is what our current implementation of the findFirst override looks like:

// excerpt from the query section of our Prisma client extension
query: {
  $allModels: {
    async findFirst({ model, args, query }) {
      // doesModelIncludeField and containsAttribute are our own helpers
      const isDeleteable = doesModelIncludeField(model, 'deleted_at');

      if (isDeleteable) {
        // if the caller already filters on deleted_at, pass the query through unchanged
        if (containsAttribute('deleted_at', args.where || {})) {
          return query(args);
        }

        // otherwise hide soft-deleted rows by default
        args.where = { ...args.where, deleted_at: null };
        return query(args);
      }
      return query(args);
    },
  },
},
ymc9 commented 7 months ago

I think I've understood the issue. This is where ZenStack enhancement and client extensions can have some unexpected interactions ...

Just to explain what the findFirst is for. ZenStack may do a read-back after a mutation for several reasons, including:

  1. If the model under mutation has "read" policies, it reads the result back to check if it still satisfies read policies so it can be returned to the caller.
  2. If there are data validation rules (both field-level and model-level), it reads the result back to validate against them (data validation rules are like "invariants" that should hold after mutations).
  3. If there are post-update policies (rules involving future() calls), the result is read back to check them. (A minimal sketch of cases 2 and 3 follows below.)
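To make the last two cases concrete, here is a small hypothetical sketch (the Article model is made up purely for illustration) of rules that require reading the entity back after an update:

model Article {
  id     Int    @id @default(autoincrement())
  status String @default("draft")

  // data validation (case 2): an invariant on the post-update entity,
  // so the row is read back after the mutation to re-check it
  @@validate(status in ['draft', 'published'])

  // post-update policy (case 3): future() refers to the post-update state;
  // here, a published article can never go back to draft
  @@deny('update', status == 'published' && future().status == 'draft')

  @@allow('all', true)
}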

I did some debugging and also skimmed through some of Prisma's code but couldn't find a way to "tag" the operations initiated by ZenStack. Transaction ID looks like a nice way to do it, but I couldn't find a way to influence it ... I also tried to inject some bogus fields into args, but Prisma does pretty strict checking against the input.

In your case, is it possible to enhance the original PrismaClient without the client extension installed?
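Roughly something like this (names are just for illustration):

import { PrismaClient } from '@prisma/client';
import { enhance } from '@zenstackhq/runtime';

// base client with no soft-delete extension installed
const basePrisma = new PrismaClient();

// ZenStack wraps the un-extended client, so its internal read-backs
// (policy and validation checks) are not filtered by deleted_at
const db = enhance(basePrisma);

// the soft-delete extension is applied to a separate client instance,
// used only where the raw Prisma client is needed
const prismaWithSoftDelete = basePrisma.$extends({
  query: {
    validationTest: {
      async findFirst({ args, query }) {
        // hide soft-deleted rows unless the caller filters on deleted_at explicitly
        if (!('deleted_at' in (args.where ?? {}))) {
          args.where = { ...args.where, deleted_at: null };
        }
        return query(args);
      },
    },
  },
});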

CollinKempkes commented 7 months ago

To move forward, we just switched to the recommended ZenStack way of doing soft deletes. With this change everything is working fine, so I guess this issue can be closed. Thank you so much @ymc9! :)

ymc9 commented 7 months ago

> To move forward, we just switched to the recommended ZenStack way of doing soft deletes. With this change everything is working fine, so I guess this issue can be closed. Thank you so much @ymc9! :)

Thank you for helping us find these issues. Really appreciate it!