Due to GraphQL's type system and built-in validation, I think I can safely assume that my resolvers receive the fields and data types they expect. That means I can use the input object itself to generate the insert statement (for example, building the column list from Object.keys(input), escaped of course).
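As a minimal sketch of that idea: build the column list from the input's keys and parameterize the values. The table and field names here are made up, and escapeIdent is an illustrative helper using standard SQL double-quote escaping, not part of any library.

```javascript
// Illustrative identifier escaping: wrap in double quotes, double any
// embedded quotes (standard SQL quoting rules).
function escapeIdent(name) {
  return '"' + String(name).replace(/"/g, '""') + '"';
}

// Build a single parameterized INSERT from an arbitrary input object.
function buildInsert(table, input) {
  const keys = Object.keys(input);
  const cols = keys.map(escapeIdent).join(', ');
  const params = keys.map((_, i) => '$' + (i + 1)).join(', ');
  return {
    text: `INSERT INTO ${escapeIdent(table)} (${cols}) VALUES (${params})`,
    values: keys.map((k) => input[k]),
  };
}

const q = buildInsert('rides', { name: 'morning loop', distance: 42 });
// q.text → INSERT INTO "rides" ("name", "distance") VALUES ($1, $2)
```

Values go through placeholders rather than string concatenation, so only the identifiers need escaping by hand.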
However, I expect this to be costly for batch inserts, since I cannot be certain users will always provide the same set of fields. I would need to map over each object to generate its own insert, then concatenate those SQL statements. Granted, I will not be inserting much data this way (with the exception of GPS data). Instead, I should coerce missing fields to null so every row can share the same INSERT INTO clause, then generate only the values. This can be done with pgp.helpers.ColumnSet; I just need to declare a default for each field when creating the ColumnSet.