Closed chetan2211 closed 1 month ago
Use `-e MYTABLE` to exclude the dropped table from the export.
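For reference, the exclusion can be given on the command line or in the configuration file. A minimal sketch (the table name `MYTABLE` and the config filename `ora2pg_file.dist` are taken from this thread; adjust to your setup):

```shell
# Exclude one or more tables from the COPY export on the command line.
# Multiple objects are comma-separated.
ora2pg -t COPY -c ora2pg_file.dist -e 'MYTABLE'

# Equivalently, set the EXCLUDE directive in the configuration file:
#   EXCLUDE   MYTABLE
```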
Thank you for the reply @darold. I got your point, but my concern is that when doing this for the first time we have no idea whether a specific table exists or not, and with a huge number of tables it would be difficult to do what you are suggesting. So what we want is this: whenever we import the data, any tables that do not exist should be logged to a file, but the import process should not stop. Later we can exclude the tables that were logged, but at least the tables that do exist will get imported.
Commit 2a005df allows online data migration to continue if the destination table doesn't exist and ON_ERROR_STOP is disabled.
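With that fix in place, the behavior described above hinges on two configuration directives already mentioned in this thread. A hedged sketch of the relevant `ora2pg.conf` fragment (assumes a build containing commit 2a005df):

```shell
# ora2pg.conf fragment: keep the COPY export running past per-table errors.
STOP_ON_ERROR   0    # do not abort the whole export when an error occurs
LOG_ON_ERROR    1    # log the failing data/table instead of stopping
```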
The data are not saved to disk; if the table doesn't exist, saving the data is useless.
Hi @darold, I have four tables in Oracle. While importing the data, we purposely drop one table; I want ora2pg to skip that table and import the remaining 3 tables' data into Postgres, which is not happening. I set the following parameters to let the data import from Oracle to Postgres continue:

```
LOG_ON_ERROR 1
STOP_ON_ERROR 0
```

```
E:\ora2pg-24.0>ora2pg -t copy -c ora2pg_file.dist -l table_copy_290524.log
[========================>] 4/4 tables (100.0%) end of scanning.
DBI::db=HASH(0x2020f31ec38)->disconnect invalidates 1 active statement handle (either destroy statement handles or call finish on them before disconnecting) at C:/Strawberry/perl/site/lib/Ora2Pg.pm line 14987.
Aborting export...
```
Kindly suggest further action. Thank you.
Regards, Chetan