dalibo / sqlserver2pgsql

Migration tool to convert a Microsoft SQL Server Database into a PostgreSQL database, as automatically as possible
http://dalibo.github.io/sqlserver2pgsql
GNU General Public License v3.0

migration.kjb Kitchen.bat Java stackoverflow #164

Closed · neveratdennys closed this issue 1 year ago

neveratdennys commented 2 years ago

I began receiving these errors once the job reached about 800 tables, and eventually figured out that there were simply too many tables in one job. I ended up splitting the migration .kjb file into 3 parts and running them separately. I just hope that is fully equivalent and that I'm not missing something. Thanks!

Starting entry [dbo_Where_Separate] 2022/07 - ERROR (version 9.3.0.0-428, build 9.3.0.0-428 from 2022-04-12 04.56.25 by buildguy) : java.lang.StackOverflowError

Edit: I just read about the -sort_size parameter for working around Java memory issues. It is probably directly relevant here, but I did not know about it when I encountered the issue.

neveratdennys commented 2 years ago

To give a bit more detail about the error: it is encountered after the data migration step, when running the job in PDI. Just leaving this here in case anyone else runs into the same issue.

neveratdennys commented 2 years ago

I don't think this is an issue with this project; I just want to leave this PDI problem here for reference.

beaud76 commented 2 years ago

Thank you for sharing your experience.

As the error comes from Java, it is likely linked to Kettle rather than to sqlserver2pgsql itself.

neveratdennys commented 2 years ago

I'm reopening this issue with more information and more testing.

It seems the generated migration.kjb file is simply too large to run in one go for bigger databases. In my previous test I split migration.kjb into 3 parts, but the limitation apparently also depends on the amount of data involved (my current test migration has much more). Right now my 3-part approach hangs on ordinary tables after around 5 minutes of processing.

I am having to manually break the migration script into roughly 5 parts for this database, and I expect to repeat this procedure on databases 10 to 20 times the size of the current one.

My current thought is to write a script that breaks the generated migration job into one job per table to avoid these issues (rough sketch below). I thought I should ask for advice here before starting; I would appreciate any input! Thank you!
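In case it helps anyone, here is an untested sketch of what I have in mind, in Python. It assumes the generated migration.kjb follows the standard Kettle job XML layout: a `<job>` root with `<entries>/<entry>` children (one SPECIAL "START" entry plus one TRANS entry per table, chained serially) and `<hops>/<hop>` links. The file names and the `split_jobs/` output directory are just placeholders; please verify the structure against your own migration.kjb before relying on this.

```python
#!/usr/bin/env python3
"""Split a sqlserver2pgsql-generated migration.kjb into one job per table.

Sketch only: assumes the standard Kettle job layout (a <job> root holding
<entries>/<entry> elements and <hops>/<hop> links between them).
"""
import copy
import pathlib
import xml.etree.ElementTree as ET

SOURCE = pathlib.Path("migration.kjb")   # job generated by sqlserver2pgsql
OUTDIR = pathlib.Path("split_jobs")      # placeholder output directory
OUTDIR.mkdir(exist_ok=True)

tree = ET.parse(SOURCE)
job = tree.getroot()
entries = job.find("entries")

# The START entry has type SPECIAL; each table's entry has type TRANS.
start = next(e for e in entries.findall("entry")
             if e.findtext("type") == "SPECIAL")
table_entries = [e for e in entries.findall("entry")
                 if e.findtext("type") == "TRANS"]

for table_entry in table_entries:
    # Deep-copy the whole job so global settings (connections, parameters,
    # logging) are preserved, then rebuild <entries> and <hops> from scratch.
    new_job = copy.deepcopy(job)
    new_entries = new_job.find("entries")
    new_hops = new_job.find("hops")
    for child in list(new_entries):
        new_entries.remove(child)
    for child in list(new_hops):
        new_hops.remove(child)

    new_entries.append(copy.deepcopy(start))
    new_entries.append(copy.deepcopy(table_entry))

    # Single hop: START -> this table's transformation entry.
    hop = ET.SubElement(new_hops, "hop")
    ET.SubElement(hop, "from").text = start.findtext("name")
    ET.SubElement(hop, "to").text = table_entry.findtext("name")
    ET.SubElement(hop, "enabled").text = "Y"
    ET.SubElement(hop, "unconditional").text = "Y"

    # Entry names may contain characters that are awkward in file names;
    # a real script should sanitize more carefully than this.
    safe_name = table_entry.findtext("name").replace("/", "_")
    out = OUTDIR / (safe_name + ".kjb")
    ET.ElementTree(new_job).write(out, encoding="UTF-8", xml_declaration=True)
    print("wrote", out)
```

Each output file could then be run individually with Kitchen (e.g. `Kitchen.bat /file:split_jobs\<table>.kjb` on Windows), so no single job ever chains more than one table entry.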