rdagumampan / yuniql

Free and open source schema versioning and database migration made natively with .NET 6. NEW THIS MAY 2022! v1.3.15 released!
https://yuniql.io
Apache License 2.0

I am running yuniql erase and I get the following error **This NpgsqlTransaction has completed; it is no longer usable**; the process deletes my tables in PostgreSQL (Hyperscale) but flags that error. #204

Closed. rdagumampan closed this issue 3 years ago.

rdagumampan commented 3 years ago

Hi @rdagumampan, I am running yuniql erase and I get the following error *This NpgsqlTransaction has completed; it is no longer usable*; the process deletes my tables in PostgreSQL (Hyperscale) but flags that error. My tables contain large amounts of data and take a long time to delete. I am trying to delete more than 300 tables.

My script is similar to this, with all 300 tables:

```sql
BEGIN; DROP TABLE IF EXISTS dd_dyp_pro.isrretencionesusopt CASCADE; COMMIT;
BEGIN; DROP TABLE IF EXISTS dd_dyp_pro.ivapfpagodefinitivopt CASCADE; COMMIT;
BEGIN; DROP TABLE IF EXISTS dd_dyp_pro.ivaprestacionservdigitales CASCADE; COMMIT;
BEGIN; DROP TABLE IF EXISTS dd_dyp_pro.ivaretencionesusopt CASCADE; COMMIT;
BEGIN; DROP TABLE IF EXISTS dd_dyp_pro.pd_pago CASCADE; COMMIT;
```

Originally posted by @somilygz in https://github.com/rdagumampan/yuniql/issues/47#issuecomment-828634387

rdagumampan commented 3 years ago

Hi @somilygz, thank you for reaching out. I must admit I have not tested yuniql on databases with large amounts of data. As a general good practice, cleaning up a database like this needs careful planning and is sometimes best executed outside of tools like yuniql. Several small transactions are also better than one big transaction.

In your sample script, I can see you enclose each statement in its own transaction, but I'd guess the CASCADE triggers a cascade of operations that eventually bloats the transaction log. In SQL Server, we would typically do this kind of cleanup by truncating the tables, setting the isolation level to reduce transaction log pressure, or deleting rows in batches before dropping the table.
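As an illustration only (this is not something yuniql does for you), a batched delete in SQL Server could look like the sketch below; the table name is borrowed from the sample script above and the batch size is an arbitrary example.

```sql
-- T-SQL sketch: delete rows in small batches so each batch commits on its own
-- (autocommit) and the transaction log stays small, then drop the emptied table.
WHILE 1 = 1
BEGIN
    DELETE TOP (50000) FROM dd_dyp_pro.pd_pago;
    IF @@ROWCOUNT = 0 BREAK;
END;

DROP TABLE IF EXISTS dd_dyp_pro.pd_pago;
```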

Another way is to execute `yuniql erase` with `--transaction-mode statement`. This disables the single all-or-nothing transaction and instead executes each statement as its own atomic unit. The caveat is that when execution fails, you need to fix the script and run again from the last failed statement.
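For reference, a minimal invocation could look like the sketch below; only the flag discussed above is shown, and your usual connection options still apply.

```shell
# Minimal sketch: run erase with per-statement transactions.
# Other required options (platform, connection string, workspace) are omitted here.
yuniql erase --transaction-mode statement
```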

If you intend to drop all tables in the database anyway, it may also be more practical to drop and recreate the database itself. We do this when running regression and integration tests.

Not sure my comment helps, but I would be very happy to hear if you find any workaround.

rdagumampan commented 3 years ago

I would have to investigate the *This NpgsqlTransaction has completed; it is no longer usable* error on PostgreSQL (Hyperscale) further. If you can tell me your PostgreSQL platform, variant, or cloud service, I'll try to find time to reproduce it. Thanks for your valuable input.

maxkudosh commented 3 years ago

Hi @somilygz, @rdagumampan

I had the same issue. In my case it was caused by a transaction timeout: the migration script took longer than the default timeout, the transaction was closed, and the `yuniql run` command failed. The issue disappeared after I increased the timeout via the `--command-timeout` option.
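For anyone landing here later, an invocation might look like the sketch below; the timeout value is an arbitrary example, the unit is assumed to be seconds, and other options are omitted.

```shell
# Sketch: raise the command timeout so long-running statements are not cut off.
yuniql run --command-timeout 3600
```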