Closed: AnsgarLichter closed this issue 6 months ago.
@BobdenOs are we aware of this limitation? Is there a workaround available?
@AnsgarLichter the purpose of the .csv files is to fill tables with static content, usually things like countries or currencies. Most of the time these datasets don't go into the millions of rows, so I don't think this is a realistic issue.
I do think it is important to consider the jsonb limitation in general. Therefore I have made a PR that switches back to the json type, which in my tests does not come with this limitation, but does seem to slow down when going over the jsonb size limit. When running the test locally, the Postgres Docker instance was not enjoying the amount of data, logging messages like:

checkpoints are occurring too frequently (25 seconds apart)

Please give the change a try to verify that it also works with your project.
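For readers who haven't followed the PR: a minimal sketch of the kind of bulk insert the json-vs-jsonb distinction matters for. This is not the adapter's actual code; the table and columns (my_entity, id, name) are placeholders and it uses the plain pg client directly. It only illustrates that casting a single large array parameter to jsonb runs into PostgreSQL's hard jsonb size limit, while casting it to json does not (though very large payloads still parse slowly).

```ts
// Illustration only: placeholder table/columns, not the adapter's real statement.
// All rows travel to PostgreSQL as one JSON array parameter and are expanded server-side.
import { Client } from "pg";

async function bulkInsert(rows: Array<{ id: string; name: string }>): Promise<void> {
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();
  try {
    // jsonb variant: the parameter is converted to jsonb first and fails once
    // the value exceeds PostgreSQL's hard jsonb size limit:
    //   FROM jsonb_to_recordset($1::jsonb) AS x(id text, name text)

    // json variant: no jsonb conversion, so that hard limit does not apply,
    // although parsing a huge payload is still slow.
    const sql = `
      INSERT INTO my_entity (id, name)
      SELECT x.id, x.name
      FROM json_to_recordset($1::json) AS x(id text, name text)`;
    await client.query(sql, [JSON.stringify(rows)]);
  } finally {
    await client.end();
  }
}
```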
Additional errors encountered while testing and using jsonb:
@BobdenOs is this change included in v1.7.0 of the PostgreSQL adapter? It was not mentioned in the release notes. If it is included, I can test it.
@AnsgarLichter it is not yet released. It is currently planned to be included with 1.7.1.
PR: https://github.com/cap-js/cds-dbs/pull/587
Description of erroneous behaviour
I am running a PostgreSQL instance locally and want to test some things with PostgreSQL, for which I need a certain amount of data in my database.
In the respective entity I have more than 1 million entries, which look like this:
I generate the data with a custom script and save it to a CSV file. Deploying this to SQLite works fine, but when I try to deploy it to Postgres I get the following error message:
As I found out via Google, this is a hard limit of PostgreSQL. Is there a workaround for deploying such huge files, or is it possible to split the queries during deployment if the jsonb array gets too large?
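Regarding the "split the queries" idea, here is a minimal sketch of what that could look like, assuming a hypothetical entity with id/name columns and using the plain pg client; the actual deployment path in cds-dbs may look different. Each chunk is sent as its own JSON array parameter, so no single value comes near the jsonb limit.

```ts
// Hypothetical chunked insert: placeholder table/columns, not project code.
import { Client } from "pg";

async function insertInChunks(
  client: Client,
  rows: Array<{ id: string; name: string }>,
  chunkSize = 50_000
): Promise<void> {
  for (let i = 0; i < rows.length; i += chunkSize) {
    const chunk = rows.slice(i, i + chunkSize);
    // Each batch is a separate, reasonably small JSON array parameter.
    await client.query(
      `INSERT INTO my_entity (id, name)
       SELECT x.id, x.name
       FROM json_to_recordset($1::json) AS x(id text, name text)`,
      [JSON.stringify(chunk)]
    );
  }
}
```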
Detailed steps to reproduce
Details about your project