aws-amplify / amplify-category-api

The AWS Amplify CLI is a toolchain for simplifying serverless web and mobile development. This plugin provides functionality for the API category, allowing for the creation and management of GraphQL and REST based backends for your amplify project.
https://docs.amplify.aws/
Apache License 2.0

Amplify API - Add support for DynamoDB batch operations. #443

Open ghost opened 5 years ago

ghost commented 5 years ago

Amplify API should add support for batch operations - i.e. generate the appropriate resolvers and schema.graphql entries for DynamoDB (and possibly Aurora Serverless in the future).

The support could follow AWS's own AppSync tutorial: https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-dynamodb-batch.html

Once supported, applications can use those batch operations to replace suboptimal sequences of single-item operations.
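
For a sense of what this buys you, here is a minimal TypeScript sketch contrasting N single-item writes with one batch call. It assumes the AWS SDK v3; the table name "Todo-dev" and the item shape are hypothetical, not what Amplify generates:

import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, PutCommand, BatchWriteCommand } from "@aws-sdk/lib-dynamodb";

const doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));
type Item = Record<string, any>;

// Suboptimal: one network round trip per item.
async function putOneByOne(items: Item[]) {
  for (const item of items) {
    await doc.send(new PutCommand({ TableName: "Todo-dev", Item: item }));
  }
}

// Batched: the same writes in a single BatchWriteItem request (up to 25 items).
async function putBatch(items: Item[]) {
  await doc.send(new BatchWriteCommand({
    RequestItems: {
      "Todo-dev": items.map((item) => ({ PutRequest: { Item: item } })),
    },
  }));
}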

The current GraphQL subscriptions should not be broken (i.e. when multiple items are created/updated/deleted via a batch operation, the application should still get the appropriate notifications about those items via subscription).

What could be discussed (ideally between the AWS Amplify API and AWS AppSync teams) is whether some new "batch" subscriptions can be added to the API, so that an application can get a notification about multiple created/updated/deleted items via one subscription message.

mikeparisstuff commented 5 years ago

Thanks for the feedback. I will record this feature request so that it can get prioritized. Philosophically there is no reason we can't support batch operations out of the box, but doing so will also require adding support via the @auth & @versioned directives. As a mitigation, you will be able to write your own resolvers that target the full AppSync surface area as soon as this PR is merged (#581).

What could be discussed (ideally between the AWS Amplify API and AWS AppSync teams) is whether some new "batch" subscriptions can be added to the API, so that an application can get a notification about multiple created/updated/deleted items via one subscription message.

You can do this today as long as you specify the same return types for the mutation and subscription. For example you can do the following:

type Mutation {
  batchPut(items: [ItemInput]): [Item]
}
type Subscription {
  onBatchPut: [Item] @aws_subscribe(mutations:"batchPut")
} 
kaustavghosh06 commented 5 years ago

@silvesteradamik We've added functionality for custom resolvers. Please run npm install -g @aws-amplify/cli to install the latest version of the CLI. For documentation, please refer to https://aws-amplify.github.io/docs/cli/graphql#overwrite-a-resolver-generated-by-the-graphql-transform

Here's the launch announcement: https://aws.amazon.com/blogs/mobile/amplify-adds-support-for-multiple-environments-custom-resolvers-larger-data-models-and-iam-roles-including-mfa/

Maybe you could try out custom resolvers and see if they solve your use case for the time being. Let me know if you have any concerns.

tafelito commented 5 years ago

@mikeparisstuff what would be the best way to work around DynamoDB's 25-item limit when working with batch updates? Can you split the operations in the resolver?
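
For reference: a single VTL mapping template issues one BatchWriteItem call and cannot loop over several, so the splitting usually has to happen outside the template, e.g. client-side or in a Lambda. A sketch of the chunking, assuming the AWS SDK v3 (a real implementation would add exponential backoff around the retry):

import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, BatchWriteCommand } from "@aws-sdk/lib-dynamodb";

const doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));
const BATCH_SIZE = 25; // hard DynamoDB limit per BatchWriteItem call

async function batchPutAll(tableName: string, items: Record<string, any>[]) {
  for (let i = 0; i < items.length; i += BATCH_SIZE) {
    // Take the next slice of at most 25 items.
    let requestItems: Record<string, any[]> = {
      [tableName]: items
        .slice(i, i + BATCH_SIZE)
        .map((item) => ({ PutRequest: { Item: item } })),
    };
    // Under throttling DynamoDB may return UnprocessedItems; retry until drained.
    while (Object.keys(requestItems).length > 0) {
      const { UnprocessedItems = {} } = await doc.send(
        new BatchWriteCommand({ RequestItems: requestItems })
      );
      requestItems = UnprocessedItems;
    }
  }
}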

JenilynnB commented 4 years ago

You can do this today as long as you specify the same return types for the mutation and subscription. For example you can do the following:

type Mutation {
  batchPut(items: [ItemInput]): [Item]
}
type Subscription {
  onBatchPut: [Item] @aws_subscribe(mutations:"batchPut")
} 

@mikeparisstuff should this example actually work today? I have this exact thing set up and the subscription does not fire on batch mutations. The batch mutations are working correctly and other subscriptions on single item mutations also fire, but I have not been able to get this to work. Would love to know whether I should continue to troubleshoot this or whether the functionality does not actually exist.

loganpowell commented 4 years ago

I love Amplify so far. As soon as I saw what I would need to do to get batch reads/writes, I thought to myself: I bet the team will fix this one day (I know you folks are serious about DX). After all, .vtl isn't exactly a common language among front-end engineers...

nathanqueija commented 4 years ago

Same here. I read three different articles plus the docs (that's all I could find on the internet related to batch operations) and I still have no idea how to accomplish this. It's super complicated: I tried hard to dive deep into VTL and to understand the config file in the stacks folder, which is full of terms I don't understand. No success.

brianjychan commented 4 years ago

@nathanqueija following this guide worked for me. Agreed that .vtl is quite opaque though. https://medium.com/@jan.hesters/creating-graphql-batch-operations-for-aws-amplify-with-appsync-and-cognito-ecee6938e8ee

Thanks @janhesters

A few gotchas I ran into:

  • Make sure to replace the table name with your table name in the vtl files
  • If you use a key other than 'todos' in the Mutation, make sure that matches in the vtl files
  • In CustomResources.json, the value that DataSourceName should have can be found by running amplify console api and clicking "Data Sources" on the left

GMithapara-Fintech commented 4 years ago

You can do this today as long as you specify the same return types for the mutation and subscription. For example, you can do the following:


 type Mutation {
   batchPut(items: [ItemInput]): [Item]
 }

 type Subscription {
   onBatchPut: [Item] @aws_subscribe(mutations:"batchPut")
 } 

@mikeparisstuff: should this example actually work today? I have this exact thing set up and the subscription does not fire on batch mutations. Can you please provide a working solution for subscriptions on batch mutations?

idobleicher commented 3 years ago

It's really complicated, and what @mikeparisstuff suggested will not work on its own.

First you need the mutation, but for the whole process to work you will also need: 1) custom VTL request and response templates for the batch write, and 2) a custom resource that attaches the right resolver to the data source.

Amazon provides documentation on how to do it: https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-dynamodb-batch.html

But it's still not perfect if you have the issue that I have, which is multiple envs. I haven't yet found a hermetic way to pass the APP_ID and the ENV into the VTL file in order to write to the right table.

Please follow the guide I posted. BTW, if you struggle I will post a fully working example; it's just a lot of files and configuration I need to change first :)

Steps:

  1. GraphQL schema with an input and a response list of the type.
  2. Custom resources (a pipeline resolver, in order to put env and AppId in the params).
  3. Custom resources for the actions you want to perform.
  4. Plug and play! Of course, activate CloudWatch and X-Ray to understand what's going on, and check the relevant role for permissions.
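
One way to sidestep the env/APP_ID problem entirely is to do the batch write in a Lambda data source, where the function can read them from process.env. A sketch, assuming the variable names API_ID and ENV are wired in and that your tables follow Amplify's usual <Type>-<apiId>-<env> naming (verify both against your own stack):

import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, BatchWriteCommand } from "@aws-sdk/lib-dynamodb";

const doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));

// API_ID and ENV are assumed to be set as Lambda environment variables.
const tableName = `Todo-${process.env.API_ID}-${process.env.ENV}`;

// Handler for an AppSync Lambda data source; `arguments` carries the GraphQL args.
export const handler = async (event: { arguments: { items: Record<string, any>[] } }) => {
  const { items } = event.arguments;
  await doc.send(new BatchWriteCommand({
    RequestItems: {
      [tableName]: items.map((item) => ({ PutRequest: { Item: item } })),
    },
  }));
  return items;
};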

ryarasi commented 3 years ago

@nathanqueija following this guide worked for me. Agreed that .vtl is quite opaque though. https://medium.com/@jan.hesters/creating-graphql-batch-operations-for-aws-amplify-with-appsync-and-cognito-ecee6938e8ee

Thanks @janhesters

A few gotchas I ran into:

  • Make sure to replace the table name with your table name in the vtl files
  • If you use a key other than 'todos' in the Mutation, make sure that matches in the vtl files
  • In CustomResources.json, the value that DataSourceName should have can be found by running amplify console api and clicking "Data Sources" on the left

So I'm trying to get Jan Hesters' Medium article to work, but I'm running into an issue.

From what I can gather, we have to change the name of the resource itself (i.e. BatchAddTodosResolver), "DataSourceName", and "FieldName". I updated all three to match my use case. I'm not sure if we're supposed to modify other variables in that JSON, but after updating those three, I still get an error when I push the change with amplify push.

CREATE_FAILED BatchAddTodosResolver AWS::AppSync::Resolver Thu May 06 2021 11:08:52 GMT The specified key does not exist. (Service: Amazon S3; Status Code: 404; Error Code: NoSuchKey; Request ID: ; S3 Extended Request ID: ; Proxy: null)

Not sure if I'm missing something. Please let me know what I should do to get around this.

maziarzamani commented 2 years ago

I am surprised this feature is not supported yet.

alex-vladut commented 2 years ago

I agree that it would be quite useful to have Amplify generate a mutation for creating multiple entities of the same type. To provide some context, I worked with Hasura on a project and found the mutation they provide very useful in practice; here is an example of what it looks like: https://hasura.io/docs/latest/graphql/core/databases/postgres/mutations/insert.html#insert-multiple-objects-of-the-same-type-in-the-same-mutation. In addition, it may be useful to wrap the write requests in a DynamoDB transaction so that either all the entities are saved or none are.

Another variation I found useful is to extend the create-entity mutation to include creating related objects in a single request. Let's consider this simple example with a hierarchical model, where a Message can be either an Email or an Sms:

enum MessageType {
  SMS
  EMAIL
}

type Message @model {
  id: ID! @primaryKey
  type: MessageType!
  sms: MessageSms @hasOne(fields: ["id"])
  email: MessageEmail @hasOne(fields: ["id"])
}

type MessageSms @model {
  id: ID! @primaryKey
  content: String!
}

type MessageEmail @model {
  id: ID! @primaryKey
  content: String!
}

I would like to be able to save such a message in one go, so a mutation could be expressed in this way:

mutation createSmsMessage {
  createMessage(input: {
    type: SMS
    sms: {
      content: "Hello"
    }
  }) {
    id
  }
}

The relationships between entities in GraphQL can be nested multiple levels deep, but I think generating the mutation resolver to support the first level of dependencies is more than enough for the use cases I came across. Thanks for your work; let me know if there is anything I can help with to move this feature forward (in case you think it is useful).
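
To make the idea concrete, here is a TypeScript sketch of such a resolver implemented in a Lambda, using a DynamoDB transaction so that the Message and its MessageSms/MessageEmail either both persist or neither does. The table names and input shape are illustrative, not something Amplify generates today:

import { randomUUID } from "node:crypto";
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, TransactWriteCommand } from "@aws-sdk/lib-dynamodb";

const doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));

interface CreateMessageInput {
  type: "SMS" | "EMAIL";
  sms?: { content: string };
  email?: { content: string };
}

export async function createMessage(input: CreateMessageInput) {
  const id = randomUUID();
  // Pick the child table and payload based on the message type.
  const child = input.type === "SMS"
    ? { table: "MessageSms", item: { id, ...input.sms } }
    : { table: "MessageEmail", item: { id, ...input.email } };

  // TransactWriteItems commits both puts atomically, or neither.
  await doc.send(new TransactWriteCommand({
    TransactItems: [
      { Put: { TableName: "Message", Item: { id, type: input.type } } },
      { Put: { TableName: child.table, Item: child.item } },
    ],
  }));
  return { id };
}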

vincent38wargnier commented 2 years ago

Any update?

raph commented 1 year ago

Please prioritize this; it would help a lot in making DataStore faster and more responsive.

pjsandwich commented 1 year ago

Talking with the Amplify team, it doesn't sound like this is a priority. However, you can implement batch operations against your DynamoDB tables in a Lambda function (for example with an Express app on it) that uses the BatchWriteItem API. It is a pain to do for multiple tables, or for tables with nested relationships, but it is relatively straightforward for single-table use cases.
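
A minimal sketch of that shape, assuming Express and the AWS SDK v3 (in Lambda you would front this with API Gateway and a serverless adapter instead of app.listen; the route and response shape are illustrative):

import express from "express";
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, BatchWriteCommand } from "@aws-sdk/lib-dynamodb";

const doc = DynamoDBDocumentClient.from(new DynamoDBClient({}));
const app = express();
app.use(express.json());

// POST /batch/:table with a JSON array of up to 25 items in the body.
app.post("/batch/:table", async (req, res) => {
  const items = req.body as Record<string, any>[];
  try {
    await doc.send(new BatchWriteCommand({
      RequestItems: {
        [req.params.table]: items.map((item) => ({ PutRequest: { Item: item } })),
      },
    }));
    res.json({ written: items.length });
  } catch (err) {
    res.status(500).json({ error: String(err) });
  }
});

app.listen(3000);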

pjsandwich commented 1 week ago

@josefaidt I think this issue can be closed as resolved: feature support