Running the batch-processing script openlibrary-data-chunk-process.py and then loading openlibrary-db.sql only gets 2000 rows into each created table; the load seems to stop after that. Postgres version is 15.2, everything else looks fine, and the processed files are present in the data/processed folder.
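To narrow down whether the 2000-row cap comes from the chunking step or from the SQL load, it may help to count the rows actually present in the processed chunk files. Below is a minimal sketch; the assumptions are that the chunks in data/processed are newline-delimited text files with a .txt extension (adjust the glob pattern if your chunk files use a different extension), and the `count_chunk_rows` helper name is mine, not from the repo.

```python
from pathlib import Path


def count_chunk_rows(processed_dir: str, pattern: str = "*.txt") -> dict:
    """Count lines per chunk file in processed_dir.

    If each file already holds at most 2000 lines, the limit was
    introduced by the chunking script; if the files are larger, the
    SQL load is stopping early instead.
    """
    counts = {}
    for path in sorted(Path(processed_dir).glob(pattern)):
        # Read in binary mode so odd encodings don't abort the count.
        with path.open("rb") as fh:
            counts[path.name] = sum(1 for _ in fh)
    return counts
```

If the per-file counts exceed 2000, the next place to look would be the COPY statements in openlibrary-db.sql and the Postgres server log for the point where the load stops.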