ffxsam closed this 3 years ago
@jeremydaly Re. the engine
option we discussed: how do you feel about making this a breaking change, bumping to 2.0, and making engine
a required config option? I feel like defaulting to either MySQL or Postgres could be problematic.
Also, are there any type issues you (or anyone else) have encountered with MySQL that should be addressed?
Ok, cool! So this works great:
await client.query('INSERT INTO dummy(file_uuid, metadata) VALUES(:fileUuid, :metadata)', [
{
name: 'fileUuid',
value: '5600b63d-66c3-49e6-9728-6d81758c6039',
cast: 'uuid',
},
{
name: 'metadata',
value: JSON.stringify({ name: 'Bob', age: 2 }),
cast: 'jsonb',
},
])
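To make the mechanics concrete, here's a minimal sketch of how a client could honor the cast property — this is an assumption about the approach, not the library's actual internals: each named placeholder whose parameter carries a cast gets rewritten with an explicit Postgres-style cast before the SQL is sent to the Data API.

```javascript
// Hypothetical sketch (NOT the library's actual implementation): rewrite each
// named placeholder that has a `cast` into an explicit Postgres ::cast.
function applyCasts(sql, params) {
  return params.reduce(
    (query, p) =>
      p.cast ? query.replace(`:${p.name}`, `:${p.name}::${p.cast}`) : query,
    sql
  )
}

const rewritten = applyCasts(
  'INSERT INTO dummy(file_uuid, metadata) VALUES(:fileUuid, :metadata)',
  [
    { name: 'fileUuid', value: '5600b63d-66c3-49e6-9728-6d81758c6039', cast: 'uuid' },
    { name: 'metadata', value: JSON.stringify({ name: 'Bob', age: 2 }), cast: 'jsonb' },
  ]
)
// rewritten:
// INSERT INTO dummy(file_uuid, metadata) VALUES(:fileUuid::uuid, :metadata::jsonb)
```

The nice part of this shape is that the cast lives entirely in the SQL string, so the parameter values themselves can stay plain strings.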
I just need someone to test this using MySQL and help with any fixes/changes.
I'm also not sure how to make casting work with this complex syntax:
let batchInsert = await data.query(
`INSERT INTO myTable (name,age,has_curls) VALUES(:name,:age,:curls)`,
[
[{ name: 'Marcia', age: 17, curls: false }],
[{ name: 'Peter', age: 15, curls: false }],
[{ name: 'Jan', age: 15, curls: false }],
[{ name: 'Cindy', age: 12, curls: true }],
[{ name: 'Bobby', age: 12, curls: false }]
]
)
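One purely hypothetical direction for the batch case (nothing like this is implemented — just sketching the open question): allow each value in a batch row to be either a plain value or a { value, cast } object, and split the casts out per column before the placeholders are rewritten.

```javascript
// Hypothetical sketch only: normalize a batch row where any value may be
// either a plain value or a { value, cast } object, separating the cast
// hints from the actual values.
function splitCasts(row) {
  const values = {}
  const casts = {}
  for (const [key, v] of Object.entries(row)) {
    if (v !== null && typeof v === 'object' && 'value' in v) {
      values[key] = v.value
      if (v.cast) casts[key] = v.cast
    } else {
      values[key] = v
    }
  }
  return { values, casts }
}

// A row like { name: 'Cindy', age: { value: 12, cast: 'int2' }, curls: true }
// yields values { name: 'Cindy', age: 12, curls: true } and casts { age: 'int2' }.
```

Since casts are per-column, the first row's cast hints could apply to the whole batch, which would keep the remaining rows terse.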
@jeremydaly Just checking in to see what you thought of this and if you have any changes, or see anything in need of correction.
Great work @ffxsam
This will help resolve some issues in this library https://github.com/ArsenyYankovsky/typeorm-aurora-data-api-driver/issues/44, but I see you're active in there as well.
@harm-less Thanks! Though I don't know how much of a help it'll be to TypeORM, since the ORM doesn't pass type information to the Data API client.
It can't? 😟
But why is this required to begin with? Doesn't Postgres already check types? Isn't the syntax very similar to that of a query sent to a regular Postgres database? What's the difference?
> But why is this required to begin with, doesn't Postgres already check types?
The Aurora Data API has problems with types, for whatever reason, and it requires more explicit casting than is needed when using a regular connection to Postgres and passing SQL queries.
I can't remember the specifics right now, but I started using TypeORM with the Aurora Data API driver, and noticed that TypeORM doesn't pass any type information to the drivers it uses. And since my PR here relies on being passed a cast
field, I don't think this would help with TypeORM at all.
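For context on why explicit casts come up at all, here's a sketch of what a raw Data API request underneath looks like (assumptions flagged: the ARNs are placeholders, and this mirrors the AWS RDSDataService ExecuteStatement request shape as I understand it). The Data API's own typeHint field only covers a handful of types (e.g. UUID, JSON), so other Postgres types still need a ::cast written into the SQL itself — which is what the cast option automates.

```javascript
// Sketch of a raw Data API request. The ARNs below are placeholders, not real
// resources. Because typeHint can't express every Postgres type, the cast is
// embedded in the SQL string instead.
const request = {
  resourceArn: 'arn:aws:rds:region:account:cluster:example',         // placeholder
  secretArn: 'arn:aws:secretsmanager:region:account:secret:example', // placeholder
  sql: 'INSERT INTO dummy(file_uuid) VALUES(:fileUuid::uuid)',
  parameters: [
    { name: 'fileUuid', value: { stringValue: '5600b63d-66c3-49e6-9728-6d81758c6039' } },
  ],
}
```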
(note: still need to document this, then will merge)
Hey, any update on this MR?
@tommy31 I haven't had time to write documentation for this PR yet. Anyone is welcome to pitch in on that!
I think this is ready to go. Fixed one bug, added a unit test, and added some brief documentation into the README.
@jeremydaly Does this look ok to you? If so, feel free to just squash merge it, and publish to npm!
Any update on this @jeremydaly?
Jeremy is a busy dude, so be patient. I could merge it myself, but since it's my first PR since becoming a collaborator, I wanted to run this by him. And I don't have npm publish permissions anyway.
@ffxsam, I merged this in and then made a few updates to the README to document the new engine
config parameter. I also made that parameter optional with a default to mysql
for backwards compatibility.
It's been published to npm @ v1.2.0.
@jeremydaly Awesome, thanks a bunch! I can finally remove the local copies of my PR I've been using. 😁
Thank you guys. This is a nice update!
It's clear based on several open issues that the Aurora Data API is a bit clunky when it comes to handling certain data types (jsonb, uuid, etc.). This PR's goal will be to allow specifying a cast property in parameters to allow casting certain fields as needed. If anyone has input, please contribute by commenting on this PR, and I'm happy to discuss any potential changes to the code.