Closed rohankharwar closed 6 years ago
Can we fix this for the particular case but also for the general case where we don't have a correct mapping?
I.e. mapping numbers to long/double and everything else to string?
We would need the fix by early next week.
As we still don't manage arrays, the candidate RDBMS-to-Neo4j datatype mapping could be:
```java
switch (javaSqlTypeGroup) {
    case bit:
        return Neo4jDataType.Byte;
    case integer:
        return Neo4jDataType.Long;
    case real:
        return Neo4jDataType.Double;
    case object:
    case large_object:
    case reference:
    case unknown:
    case binary:
        return null;
    case id:
    case url:
    case xml:
    case temporal:
    case character:
    default:
        return Neo4jDataType.String;
}
```
BLOB and CLOB are treated as large_object by SchemaCrawler (see here), and we exclude them from the import by returning null in the above switch; otherwise the risk is having a node property as big as the content of the B/CLOB.
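As a self-contained sketch of this proposal, the switch could live in a small mapping method where null signals "skip this column". Note that the enums below are stubs written for illustration only, not the actual SchemaCrawler `JavaSqlTypeGroup` or neo4j-etl `Neo4jDataType` classes:

```java
// Hypothetical sketch of the proposed RDBMS-to-Neo4j type mapping.
// Both enums are stand-ins for the real SchemaCrawler / neo4j-etl types.
public class SqlTypeMapping {

    // Stub subset of SchemaCrawler's JavaSqlTypeGroup.
    enum JavaSqlTypeGroup {
        bit, integer, real, object, large_object, reference,
        unknown, binary, id, url, xml, temporal, character
    }

    // Stub of the ETL tool's Neo4j datatype enum.
    enum Neo4jDataType { Byte, Long, Double, String }

    // Returns null for groups excluded from the import (e.g. B/CLOBs),
    // so callers can skip the column instead of materializing huge properties.
    static Neo4jDataType toNeo4jType(JavaSqlTypeGroup group) {
        switch (group) {
            case bit:
                return Neo4jDataType.Byte;
            case integer:
                return Neo4jDataType.Long;
            case real:
                return Neo4jDataType.Double;
            case object:
            case large_object:
            case reference:
            case unknown:
            case binary:
                return null; // excluded from the import
            default:
                return Neo4jDataType.String; // id, url, xml, temporal, character, ...
        }
    }
}
```

A caller would then only create a node property when the mapping is non-null, which is what keeps B/CLOB content out of the graph.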
If ok, I can release a patch for that.
This has been fixed for some months now @rohankharwar, could you re-check?
I am using the neo4j-etl tool to import data from Oracle 12c into Neo4j. Oracle is set up in Docker. However, when I run neo4j-etl I get the following error. Here is my command