anlek / mongify

Mongify allows you to map your data from a SQL database into a MongoDB document database.
http://github.com/anlek/mongify
MIT License

Not able to process large data (more than 10k records) when processing with batch_size from Oracle #161

Open aravinreizend opened 6 years ago

aravinreizend commented 6 years ago

Hi

Note: I have upgraded to 1.4 using https://github.com/anlek/mongify/pull/148

I have been facing the issue below for more than 2 weeks. I have tried many ways to track it down but have not been able to fix it. Please help us resolve it. For small tables it works without a batch_size definition.

If I set batch_size to the total row count, it works for small tables. But with about 2 million total records, setting batch_size to 50000 does not work: the first 50000 records are processed successfully, while the remaining records are marked as complete but never copied into my MongoDB. I tried changing the numbers in the oracle.config file below, but it still does not work for large tables.
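For context, batch processing of this kind is usually implemented as repeated limit/offset page reads against the source table. A minimal sketch of that general pattern in Ruby, purely to illustrate the mechanism (hypothetical code, not Mongify's actual internals; connection details and credentials are placeholders):

```ruby
require 'sequel'
require 'mongo'

# Hypothetical connections; hosts and credentials are placeholders.
db    = Sequel.connect(adapter: 'oracle', host: 'xxx.xx.x.xxx',
                       user: 'user', password: 'pass', database: 'DEV')
mongo = Mongo::Client.new(['xxx.xx.x.xxx:27017'], database: 'eastern_oracle')

batch_size = 50_000
offset     = 0
loop do
  # Fetch one page of rows, ordered by the key so pages neither overlap nor skip.
  rows = db[:ra_customer_trx_all].order(:customer_trx_id)
                                 .limit(batch_size, offset).all
  break if rows.empty?
  mongo[:ra_customer_trx_all].insert_many(rows)
  offset += batch_size
end
```

The symptom described above (first batch copied, the rest marked done but missing) is the kind of failure that shows up when paging past the first batch silently fails, though the actual cause in Mongify may of course be different.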

Here is the config file (oracle.config):

```ruby
sql_connection do
  adapter  "oracle_enhanced"
  host     "xxx.xx.x.xxx"
  port     "1525"
  username "**"
  password "***"
  database "DEV"

  batch_size 20000 # This is defaulted to 10000 but in case you want to make that smaller (on lower RAM machines)

  # Uncomment the following line if you get a "String not valid UTF-8" error.
  encoding "utf8"
end
```

```ruby
mongodb_connection do
  host     "xxx.xx.x.xxx"
  database "eastern_oracle"

  # Uncomment the following line if you get a "String not valid UTF-8" error.
  encoding "utf8"
end
```

Here is the translation file (oracle_translation.rb):

table "ra_customer_trx_all" do column "customer_trx_id", :key, :as => :integer column "trx_number", :string column "trx_date", :string column "exchange_rate", :string end

I have attached the error as a screenshot here: [issue screenshot]

panchalhitesh commented 5 years ago

How do I configure the Oracle 12c driver to be used for the source database?

Please help me with the steps, or the driver type required to configure it.
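Not an authoritative answer, but based on the working configuration earlier in this thread, Mongify reaches Oracle through the oracle_enhanced adapter, which on MRI typically needs the ruby-oci8 gem and the Oracle client libraries installed alongside it. A minimal sketch, with placeholder host, credentials, and service name:

```ruby
# Sketch of a Mongify sql_connection for an Oracle source (placeholders throughout).
# Assumes the activerecord-oracle_enhanced-adapter and ruby-oci8 gems are installed,
# along with the Oracle Instant Client on the machine running mongify.
sql_connection do
  adapter  "oracle_enhanced"
  host     "your-oracle-12c-host" # placeholder
  port     "1521"                 # default Oracle listener port
  username "your_user"            # placeholder
  password "your_password"        # placeholder
  database "ORCL"                 # placeholder SID / service name
end
```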