jeremydaly / data-api-client

A "DocumentClient" for the Amazon Aurora Serverless Data API
MIT License

Update to support array values #31

Open · NewtrinoPiGui opened this issue 4 years ago

NewtrinoPiGui commented 4 years ago

The README states that array values are not currently supported. However, I'm fairly sure the AWS Data API does support them now (it may not have until recently, I'm not sure). The syntax is a little different from the other types; the SqlParameter would look something like this (adapted from the related example in the README):

{ name: 'id', value: { arrayValue: { longValues: [1,2,3,4,5] } } }

This does seem like a major limitation when using this library, and I'm not sure whether it has been attempted before.

One thing that would need to be considered is that there may be a limit on how large such an array can be in a query, depending on the DBMS. I'm not sure whether this library is the best place to implement that logic and split the arrays into chunks of limited size, or whether that should be left to the consumer.
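
For illustration, here is a rough sketch of what consumer-side chunking could look like if it were left outside the library. The chunk and findAccounts helpers are hypothetical, the values are only inlined because they're coerced to integers first, and the { records } result shape is assumed from the README:

// Hypothetical helper: split an array into chunks of at most `size` elements
const chunk = (arr, size) =>
  Array.from({ length: Math.ceil(arr.length / size) }, (_, i) =>
    arr.slice(i * size, i * size + size)
  )

// Hypothetical consumer-side workaround: one query per chunk, results merged
const findAccounts = async (ids, size = 100) => {
  const records = []
  for (const part of chunk(ids, size)) {
    // Coerce to integers so inlining into the SQL string stays safe
    const inList = part.map(v => parseInt(v, 10)).join(', ')
    const res = await dataApiClient.query(`SELECT * FROM account WHERE id IN (${inList})`)
    records.push(...res.records)
  }
  return records
}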

zawarski commented 4 years ago

Does the limitation you're describing only apply when using named parameters? I ask because explicit values, like:

let results = await dataApiClient.query("select * from account where id IN (302, 81, 84)");

... do work with the data-api-client.
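
Until arrayValue works end to end, one workaround sketch (assuming data-api-client's documented query(sql, params) form with a plain object of named values; the :id0/:id1 naming is just for illustration) is to expand the array into one named parameter per element:

const ids = [302, 81, 84]

// Build :id0, :id1, ... placeholders and a matching parameters object
const placeholders = ids.map((_, i) => `:id${i}`).join(', ')
const params = {}
ids.forEach((v, i) => { params[`id${i}`] = v })

let results = await dataApiClient.query(
  `SELECT * FROM account WHERE id IN (${placeholders})`,
  params
)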

NewtrinoPiGui commented 4 years ago

Yes, I'm referring to the limitation documented in the README (linked above). I did a bit more research, and it looks like there may be a known issue with the Data API's support for named array parameters, but I'm not sure, since I haven't tested it directly yet and the AWS documentation says it should work. In my example, the corresponding SQL would look like:

select *
from account a
where a.id in (:id)

and the passed named parameter would look like:

{ name: 'id', value: { arrayValue: { longValues: [1,2,3,4,5] } } }

My point in all this is that the example posted in the README looks incorrect, and I was wondering whether you intended array values to be passed as a blobValue type, or whether you actually attempted to get this working using the arrayValue syntax I described?

zawarski commented 4 years ago

I had initially thought, incorrectly, that there was an issue with any use of arrays. Thanks for the clarification.

the-smart-home-maker commented 4 years ago

Is there any plan to support such arrayValues in data-api-client in the future? It would really be an important feature, in my opinion.

jeremydaly commented 4 years ago

I'm still getting "BadRequestException: Array parameters are not supported." using the MySQL version. If there is something I'm missing, let me know. See here: https://github.com/aws/aws-sdk-js/issues/2993
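
For anyone who wants to reproduce this against the raw v2 SDK (outside of data-api-client), a minimal sketch looks roughly like this; the ARNs and database name are placeholders, and on Aurora Serverless MySQL a request like this appears to come back with the error quoted above:

const AWS = require('aws-sdk')
const rdsdata = new AWS.RDSDataService({ region: 'us-east-1' })

rdsdata.executeStatement({
  resourceArn: 'arn:aws:rds:us-east-1:123456789012:cluster:my-cluster',        // placeholder
  secretArn: 'arn:aws:secretsmanager:us-east-1:123456789012:secret:my-secret', // placeholder
  database: 'mydb',                                                            // placeholder
  sql: 'SELECT * FROM account WHERE id IN (:id)',
  parameters: [
    { name: 'id', value: { arrayValue: { longValues: [1, 2, 3, 4, 5] } } }
  ]
}).promise()
  .then(res => console.log(res.records))
  .catch(err => console.error(err)) // BadRequestException: Array parameters are not supported.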

ablankenship10 commented 3 years ago

Any updates on this? I'm not quite sure how to set up a test myself using the raw SDK, since I'm attempting to use this via TypeORM (which would need an update to this package before it works), and I have no idea how the internals work.

It seems like a basic thing AWS should have supported at launch. I've tried so many different paths using AWS tools (Serverless, DynamoDB, Aurora, AppSync), and these flagship products always seem to be missing a lot of features. It feels tough to commit to any of them for a simple API + database application. Am I wrong to feel that way? Node + pg just gets it done, but I'd really like the cost-saving and scaling benefits of serverless technologies.

EDIT: Is there any reason the sql field in the Data API ExecuteStatement request can't already contain all the values, instead of using the parameters field? That way it would just run a straight Postgres SQL statement without the limitations of the Data API's supported types, right?
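
To illustrate that EDIT: the sql field can indeed carry the values directly (that's effectively what the inline IN (302, 81, 84) query earlier in the thread does), it just means giving up the escaping that parameters provide. A rough sketch against the raw v2 SDK, with placeholder ARNs and database name:

const AWS = require('aws-sdk')
const rdsdata = new AWS.RDSDataService({ region: 'us-east-1' })

// All values inlined into `sql`; no `parameters` field at all
let results = await rdsdata.executeStatement({
  resourceArn: 'arn:aws:rds:us-east-1:123456789012:cluster:my-cluster',        // placeholder
  secretArn: 'arn:aws:secretsmanager:us-east-1:123456789012:secret:my-secret', // placeholder
  database: 'mydb',                                                            // placeholder
  sql: 'SELECT * FROM account WHERE id IN (1, 2, 3, 4, 5)'
}).promise()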

o-alexandrov commented 3 years ago

Just FYI, the latest raw AWS SDK v3 client, @aws-sdk/client-rds-data, doesn't support array values either. The actual service doesn't accept such requests.
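
For reference, this is roughly what that request looks like with the v3 client; the type definitions appear to accept the arrayValue shape, but, as noted, the service rejects the request (ARNs and database are placeholders):

const { RDSDataClient, ExecuteStatementCommand } = require('@aws-sdk/client-rds-data')

const client = new RDSDataClient({ region: 'us-east-1' })

try {
  const res = await client.send(new ExecuteStatementCommand({
    resourceArn: 'arn:aws:rds:us-east-1:123456789012:cluster:my-cluster',        // placeholder
    secretArn: 'arn:aws:secretsmanager:us-east-1:123456789012:secret:my-secret', // placeholder
    database: 'mydb',                                                            // placeholder
    sql: 'SELECT * FROM account WHERE id IN (:id)',
    parameters: [
      { name: 'id', value: { arrayValue: { longValues: [1, 2, 3, 4, 5] } } }
    ]
  }))
  console.log(res.records)
} catch (err) {
  console.error(err) // the service rejects array parameters, per the note above
}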