mschristensen opened this issue 3 years ago
What's the expectation of sqlboiler in this case? You mention intelligently adapting the query, but there doesn't appear to be any way to do that, since it's an internal Postgres limit we're up against. I'm not sure what you would do to fix this other than splitting it into multiple queries, which to me seems to violate the principle of least surprise.
I was thinking a JOIN could be used to filter the data instead of using IN?
SELECT * FROM "isrc_usage" WHERE ("isrc_usage"."usage_report_id" = $1);
SELECT * FROM "isrc" WHERE ("isrc"."id" IN ($1,$2,$3,$4,$5,$6,$7,$8,$9,$10));
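For reference, the "separate it into multiple queries" approach amounts to splitting the ID list into batches that each stay under the 65535-parameter limit. A minimal sketch (the chunkIDs and placeholders helpers are hypothetical, not part of sqlboiler):

```go
package main

import (
	"fmt"
	"strings"
)

// maxParams is PostgreSQL's hard limit on bind parameters per
// statement: the wire protocol counts them with a 16-bit integer,
// so at most 65535.
const maxParams = 65535

// chunkIDs splits ids into batches of at most size elements so that
// each batch fits in a single IN (...) clause.
func chunkIDs(ids []int64, size int) [][]int64 {
	var chunks [][]int64
	for len(ids) > 0 {
		n := size
		if len(ids) < n {
			n = len(ids)
		}
		chunks = append(chunks, ids[:n])
		ids = ids[n:]
	}
	return chunks
}

// placeholders builds "$1,$2,...,$n" for a batch of n parameters.
func placeholders(n int) string {
	ps := make([]string, n)
	for i := range ps {
		ps[i] = fmt.Sprintf("$%d", i+1)
	}
	return strings.Join(ps, ",")
}

func main() {
	ids := make([]int64, 100000)
	for i := range ids {
		ids[i] = int64(i)
	}
	// 100000 IDs -> 2 batches (65535 + 34465); each batch would then
	// be queried separately:
	//   SELECT * FROM "isrc" WHERE "isrc"."id" IN (<placeholders>)
	fmt.Println(len(chunkIDs(ids, maxParams)))
	fmt.Println(placeholders(3))
}
```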
becomes
SELECT * FROM "isrc_usage"
JOIN "isrc" ON "isrc"."id" = "isrc_usage"."isrc_id"
WHERE ("isrc_usage"."usage_report_id" = $1);
This way the generated query doesn't violate the postgres parameter limit.
It should be noted that the developer can work around this issue by removing the eager load and instead querying for each result in a loop (inefficient), or by implementing a query like the one above and using sqlboiler's Bind().
At the very least it would be great to document that an eager load is dangerous when combined with All() where n > 65535 (although at that point pagination should of course be considered).
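As a sketch of what that pagination looks like: walk the parent rows in fixed-size pages so each eager-load batch stays far below the limit. The pageBounds helper below is hypothetical; the commented query uses sqlboiler v4's generated-code conventions (models.IsrcUsages, qm.Load, qm.Limit, qm.Offset), which I'm assuming match this schema:

```go
package main

import "fmt"

// pageBounds returns the [offset, end) windows needed to walk total
// rows in pages of size limit. Each window is fetched as a separate
// query, so no single statement ever approaches 65535 parameters.
func pageBounds(total, limit int) [][2]int {
	var pages [][2]int
	for off := 0; off < total; off += limit {
		end := off + limit
		if end > total {
			end = total
		}
		pages = append(pages, [2]int{off, end})
	}
	return pages
}

func main() {
	// e.g. 70000 usages in pages of 10000 -> 7 queries. With
	// sqlboiler, each window would look roughly like:
	//
	//   usages, err := models.IsrcUsages(
	//       models.IsrcUsageWhere.UsageReportID.EQ(reportID),
	//       qm.Load(models.IsrcUsageRels.Isrc),
	//       qm.OrderBy(models.IsrcUsageColumns.ID),
	//       qm.Limit(10000), qm.Offset(off),
	//   ).All(ctx, db)
	fmt.Println(len(pageBounds(70000, 10000)))
}
```

A stable ORDER BY is needed so offset pagination doesn't skip or repeat rows between pages.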
Thanks for your work on this excellent library!
@mschristensen That doesn't exactly work because of the implementation of eager loading. Each level of the tree of objects gets grouped together and queried in one big batch, so by that point we already have all these IDs loaded into memory and have lost the ability to inner join. Using inner joins in the eager loading code where appropriate (where a human would use them) is a long-standing wishlist item of sqlboiler users, but it can't always be done because of join tables and sqlboiler's need to assign items to .R structs, etc.
@aarondl No worries, thanks for the explanation. In that case I think it would be sufficient to document the n > 65535 limitation on eager loading, and then this issue can be closed.
Thanks very much for your time!
If you're having a generation problem please answer these questions before submitting your issue. Thanks!
What version of SQLBoiler are you using (sqlboiler --version)?
SQLBoiler v4.2.0
What is your database and version (e.g. Postgresql 10)?
Postgresql 10
If this happened at generation time what was the full SQLBoiler command you used to generate your models? (if not applicable leave blank)
If this happened at runtime what code produced the issue? (if not applicable leave blank)
What is the output of the command above with the -d flag added to it? (Provided you are comfortable sharing this; it contains a blueprint of your schema.)
Please provide a relevant database schema so we can replicate your issue. (Provided you are comfortable sharing this.)
Further information. What did you do, what did you expect?
In this example, if there are 10 ISRC usages for the given usage_report_id, then eager loading the ISRCs generates queries like:
The latter query will fail if the number of ISRCs exceeds a 16-bit integer (which Postgres uses to count query parameters), resulting in an error like:
It would be fantastic if SQLBoiler could intelligently adapt the generated query when the number of parameters is too large.