apalacio9502 closed this issue 4 months ago
Hi,
I was testing with odbc 1.4.2 and I see the same problem whenever batch_rows is set to less than the number of rows in the data.frame.
Reviewing the commits, I see that before https://github.com/r-dbi/odbc/commit/6f6d4e782e9f4e040ca38c88796af7301fb0138e the default was NA, which was equivalent to the number of rows in the data.frame. As a temporary solution, the Oracle backend could default batch_rows to the number of rows in the data.frame, and the root cause of the problem could be tracked down later.
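For reference, the workaround described above can be sketched as follows. The DSN, table name, and column names are assumptions for illustration; batch_rows is an actual argument of odbc's dbWriteTable method:

```r
library(DBI)

# DSN name "Oracle" is an assumption; substitute your own connection details.
con <- dbConnect(odbc::odbc(), dsn = "Oracle")

df <- data.frame(
  id   = seq_len(2000),
  when = as.Date("2024-01-01") + seq_len(2000) - 1
)

# Passing batch_rows = nrow(df) mimics the pre-6f6d4e7 default of NA,
# i.e. the whole data.frame is sent as a single batch, which avoids
# the error seen with smaller batch sizes against Oracle.
dbWriteTable(con, "MY_TABLE", df, batch_rows = nrow(df))

dbDisconnect(con)
```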
Regards,
Hello there.
Thanks for the report. Do you have the ability to test a development branch? If so, can you give https://github.com/r-dbi/odbc/pull/810 a shot? It is aimed at helping with issues related to Oracle and writing to DATE/TIMESTAMP targets.
Hi @detule,
After testing pull request https://github.com/r-dbi/odbc/pull/810, I can confirm that it works correctly.
Thanks for your help.
Regards,
Resolved in #810. Thanks!
Hi @hadley and @simonpcouch,
Today I updated DBI to version 1.2.3 and odbc to version 1.5.0. When using dbWriteTable to write a table with a date column, I encountered two errors that seem to be related to the same issue.
The first error occurs when loading the data without specifying field.types.
The second error occurs when loading the data with specified field.types.
Both issues are resolved if I set batch_rows to 2000, which matches the number of rows in the data.frame.
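To make the two failing cases and the workaround concrete, here is a minimal sketch. The DSN, table name, column names, and the "DATE" field type are assumptions for illustration, not taken from my actual code:

```r
library(DBI)

# Connection details are assumptions; substitute your own DSN.
con <- dbConnect(odbc::odbc(), dsn = "Oracle")

# A 2000-row data.frame with a date column, matching the size mentioned above.
df <- data.frame(
  id         = seq_len(2000),
  created_at = as.Date("2024-01-01") + seq_len(2000) - 1
)

# Failing case 1: no field.types specified.
# dbWriteTable(con, "EVENTS", df)

# Failing case 2: field.types specified for the date column.
# dbWriteTable(con, "EVENTS", df, field.types = c(created_at = "DATE"))

# Workaround: set batch_rows equal to the number of rows (2000 here),
# so the whole data.frame is written in one batch.
dbWriteTable(con, "EVENTS", df, batch_rows = nrow(df))

dbDisconnect(con)
```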
I have tried to understand the cause of the problem but have not been able to isolate it. However, I can confirm that with DBI 1.2.3 and odbc 1.4.2 the issue does not occur, which leads me to believe that the problem lies in the odbc package.
Currently, I am using the Posit professional driver for Oracle, version 2024.03.0.
Regards,