albfernandez / javadbf

Java library for reading and writing Xbase (dBase/DBF) files.
GNU Lesser General Public License v3.0

Why treat memo as Number? (ver 1.7.1) #49

Closed Derek52722 closed 9 months ago

Derek52722 commented 6 years ago

I have a DBF file and want to import it into Oracle, where one column has type BLOB. When I use DBFReader to read this column, I found that it was treated as a MEMO field and DBFReader tried to convert it to a Number. But when I use Pentaho to import the same DBF file into Oracle, the MEMO field is correctly treated as Binary. I compared javadbf 1.7.1 with the javadbf build used by Pentaho (javadbf-20081125.jar); the latter converts the MEMO field as below:

    case 77:
        byte[] b_array = new byte[this.header.fieldArray[i].getFieldLength()];
        this.dataInputStream.read(b_array);
        recordObjects[i] = new String(b_array, this.characterSetName);
        break;

where 77 is the ASCII code for 'M', the MEMO field type.
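For reference, in the standard dBase III layout the type byte sits at offset 11 of each 32-byte field descriptor, so a small standalone sketch (my own, not javadbf code; it assumes that standard layout) can dump the declared type of every column:

    import java.io.DataInputStream;
    import java.io.FileInputStream;
    import java.nio.charset.StandardCharsets;

    public class FieldTypeDump {
        public static void main(String[] args) throws Exception {
            try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
                byte[] header = new byte[32];
                in.readFully(header);                     // fixed 32-byte main header
                byte[] desc = new byte[32];               // one 32-byte descriptor per column
                int first;
                while ((first = in.read()) != -1 && first != 0x0D) { // 0x0D ends the list
                    desc[0] = (byte) first;
                    in.readFully(desc, 1, 31);
                    String name = new String(desc, 0, 11, StandardCharsets.US_ASCII)
                            .replace('\0', ' ').trim();   // field name, null-padded
                    char type = (char) desc[11];          // type byte: 'M' (77) marks MEMO
                    System.out.printf("%-11s type=%c (%d)%n", name, type, (int) type);
                }
            }
        }
    }

This only reads the header, so it works even when no memo file is present.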

What can I do to import my DBF file into Oracle correctly?

albfernandez commented 6 years ago

It's strange, because MEMO fields are stored in a separate file, and the code you attached seems to read from the same file. It would be very helpful if you could provide a sample file (no confidential data, please!!!) to test this specific case.
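In dBase III files, the MEMO column in the DBF itself normally holds just a 10-character ASCII block number pointing into the DBT, which is why the DBT is needed to get the actual contents. Roughly, a minimal sketch of how such a pointer would resolve (assuming dBase III conventions: 512-byte blocks, memo text terminated by 0x1A; the method name and code page are placeholders):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.RandomAccessFile;

    public class MemoLookup {
        // Resolve a 10-char block-number pointer from the DBF record
        // into the memo text stored in the companion DBT file.
        static String readMemo(RandomAccessFile dbt, String pointerField) throws IOException {
            String trimmed = pointerField.trim();
            if (trimmed.isEmpty()) {
                return null;                              // blank pointer = no memo for this record
            }
            long block = Long.parseLong(trimmed);
            dbt.seek(block * 512);                        // dBase III uses 512-byte blocks
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            int b;
            while ((b = dbt.read()) != -1 && b != 0x1A) { // 0x1A terminates the memo text
                out.write(b);
            }
            return out.toString("Cp1252");                // assumed code page
        }
    }

So the old code you pasted is most likely returning that raw pointer string, not the memo contents.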

Derek52722 commented 6 years ago

ok, here is a sample file: https://drive.google.com/open?id=1r7Ty_yx0IT0y2dR3l7bNsPRrWK8Kk3Nm

albfernandez commented 6 years ago

It seems the memo file (DBT) is missing.

Maybe this helps you:

https://github.com/albfernandez/javadbf#reading-a-dbf-file-with-memo-file
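In short, something along these lines (a minimal sketch following that README section; data.dbf and data.dbt are placeholder names):

    import java.io.File;
    import java.io.FileInputStream;
    import java.io.InputStream;

    import com.linuxense.javadbf.DBFReader;

    public class ReadWithMemo {
        public static void main(String[] args) throws Exception {
            try (InputStream in = new FileInputStream("data.dbf")) {
                DBFReader reader = new DBFReader(in);
                // Attach the companion memo file so MEMO fields resolve
                // to their contents instead of failing.
                reader.setMemoFile(new File("data.dbt"));
                Object[] row;
                while ((row = reader.nextRecord()) != null) {
                    for (Object field : row) {
                        System.out.print(field + "\t");
                    }
                    System.out.println();
                }
            }
        }
    }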

Derek52722 commented 6 years ago

I had seen this doc before, but I wonder whether the DBT file is really necessary? I also used Pentaho (also known as Kettle, an ETL tool) to import data from the DBF file into Oracle, and it doesn't need a DBT file. Here is the jar file it uses: https://drive.google.com/open?id=14spBYw1KujswqiELxy6SUv6NusbsxUeX (sorry, I cannot find the source file; a Java decompiler may be required). Thanks a lot for your patience!!!