It's strange because MEMO fields are stored in a separate file, and the code you attached seems to work on a single file. It would be very helpful if you could provide a sample file (no confidential data, please!) to test this concrete case.
ok, here is a sample file: https://drive.google.com/open?id=1r7Ty_yx0IT0y2dR3l7bNsPRrWK8Kk3Nm
It seems the memo file (DBT) is missing.
Maybe this helps you:
https://github.com/albfernandez/javadbf#reading-a-dbf-file-with-memo-file
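For reference, a minimal sketch along the lines of that README section (the file names `data.dbf` / `data.dbt` are placeholders; `setMemoFile` is the call the linked doc describes):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;

import com.linuxense.javadbf.DBFReader;

public class ReadDbfWithMemo {
    public static void main(String[] args) throws Exception {
        try (InputStream in = new FileInputStream("data.dbf")) {
            DBFReader reader = new DBFReader(in);
            // Point the reader at the companion memo file; without it,
            // MEMO fields cannot be resolved to their contents.
            reader.setMemoFile(new File("data.dbt"));

            Object[] row;
            while ((row = reader.nextRecord()) != null) {
                for (Object value : row) {
                    System.out.print(value + "\t");
                }
                System.out.println();
            }
        }
    }
}
```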
I have seen this doc before, but I wonder whether the DBT file is really necessary. I also used Pentaho (also known as Kettle, an ETL tool) to import data from the DBF file into Oracle, and it doesn't need the DBT file. Here is the jar file it uses: https://drive.google.com/open?id=14spBYw1KujswqiELxy6SUv6NusbsxUeX (sorry, I cannot find the source file; a Java decompiler may be required). Thanks a lot for your patience!
I have a DBF file and want to import it into Oracle, and it has a column of type BLOB. When I use DBFReader to read this column, I found that it was treated as a MEMO field and DBFReader tried to convert it to a Number. But when I use Pentaho to import the same DBF file into Oracle, the MEMO field is correctly treated as binary. I compared javadbf 1.7.1 with the javadbf used by Pentaho (javadbf-20081125.jar); the latter converts the MEMO field along the lines sketched below, where 77 is the field type byte for MEMO (ASCII 'M').
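(The snippet originally attached to the issue didn't survive; the following is only an illustrative sketch of the branch being described. `convertField` is a hypothetical helper, not the actual javadbf-20081125 code.)

```java
// Illustrative sketch only: NOT the actual javadbf-20081125 source.
// It shows the kind of branch being described: when the field type
// byte from the DBF header is 77 (ASCII 'M', i.e. MEMO), the old jar
// hands the raw bytes through as binary instead of parsing a Number.
public class MemoTypeSketch {

    static Object convertField(byte fieldType, byte[] rawBytes) {
        switch (fieldType) {
            case 77: // 'M' = MEMO -> keep the raw bytes (binary)
                return rawBytes;
            case 78: // 'N' = NUMERIC -> parse as a number
                return Double.valueOf(new String(rawBytes).trim());
            default: // everything else treated as text in this sketch
                return new String(rawBytes);
        }
    }

    public static void main(String[] args) {
        byte[] blob = {0x01, 0x02, 0x03};
        Object value = convertField((byte) 77, blob);
        System.out.println(value instanceof byte[]); // true: left as binary
    }
}
```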
What can I do to import my DBF file into Oracle correctly?