Open sevetseh28 opened 6 years ago
This seems to be a bug with the Microsoft Driver. I tried with FreeTDS and the row was inserted with no problems.
Glad to hear that there is a workaround by using another driver. I will have to investigate to find out what the invalid precision value would be.
Unfortunately this workaround does not help me, as I am on a Windows box and cannot use FreeTDS. This gist provides test code. It does not matter whether I use the "SQL Server" ODBC driver or "ODBC Driver 13 for SQL Server"; the error is the same with both drivers.
Hi @MathMagique, any news on that error? I am facing the same error.
Sorry, no news on this one so far. I had too little time in the past to investigate.
I have the same problem. @MathMagique, do you have any news about this error?
turbodbc.exceptions.DatabaseError: ODBC error
state: HY104
native error code: 0
message: [Microsoft][SQL Server Native Client 11.0]Invalid precision value
Sorry, not yet :-(
me too:
DatabaseError: ODBC error state: HY104 native error code: 0 message: [Microsoft][ODBC Driver 17 for SQL Server]Invalid precision value
Hello @MathMagique
We have the same problem when the string length is more than 1600. Along with that, we found the exception below for data with REAL, BIT, and DATE datatypes. Our use case is an ETL program migrating data from an MSSQL source to an MSSQL destination. Any update on this would really help us proceed; please see the exception below for more info.
RuntimeError: Unable to cast Python instance to C++ type (compile in debug mode for details)
Thanks, Simhadri
Hi, I'm using make_options:
make_options(fetch_wchar_as_char=True,
             prefer_unicode=True,
             read_buffer_size=Megabytes(1000),
             parameter_sets_to_buffer=200000,
             large_decimals_as_64_bit_types=True,
             varchar_max_character_limit=8000,
             limit_varchar_results_to_max=True,
             force_extra_capacity_for_unicode=True)
I have the same problem when prefer_unicode=True (message: [Microsoft][SQL Server Native Client 11.0]Invalid precision value). Changing to prefer_unicode=False does not give an error, but Unicode characters do not appear correctly. I'm waiting for your help. Thanks.
Similar to #282
The issue even happens with the latest editions:
ceODBC.exceptions.DatabaseError: [Microsoft][ODBC Driver 17 for SQL Server]Invalid precision value
I'm using a dynamic tuple:
cursor.executemany(SQL_INSERT_QUERY, [tuple(row) for row in batch])
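As a side note, building the whole list of tuples at once can make the ODBC parameter array very large. A minimal sketch of splitting rows into fixed-size batches first (chunked_tuples is a hypothetical helper, not part of turbodbc or this thread):

```python
# Hypothetical helper: split rows into fixed-size batches of tuples before
# passing each batch to cursor.executemany(), keeping the ODBC parameter
# arrays at a manageable size.
def chunked_tuples(rows, batch_size):
    """Yield lists of at most batch_size rows, each row converted to a tuple."""
    for start in range(0, len(rows), batch_size):
        yield [tuple(row) for row in rows[start:start + batch_size]]

rows = [[1, "x" * 1700], [2, "y" * 1700], [3, "z"]]
batches = list(chunked_tuples(rows, 2))
# each batch could then be sent with:
# cursor.executemany(SQL_INSERT_QUERY, batch)
```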
Hi, where did you make these changes? Thanks.
The bug still seems to be open, and the latest MS driver hasn't changed the issue: quite annoying for Windows users dealing with long strings.
It would seem to me that it has either been fixed, at the latest with ODBC driver version 17, or it was never a driver error to begin with. I can confirm that inserting strings > 1600 characters into NVARCHAR(max) columns using ODBC bulk inserts (a.k.a. parameter arrays) works.
Unlike the originally reported error, the DBMS used in this case is a community edition MSSQL 2019 server. I do not know how similar it is to an Azure SQL database. It is doubtful this is an MSSQL driver issue, though.
I ran some tests and successfully resolved the issue by enabling the following two options within turbodbc.make_options():
Both options need to be enabled. The documentation states that their activation affects performance and behaviour, but I would appreciate further clarification on how these two options specifically influence the import of long strings.
If necessary, I can provide a short script that includes two CSV files to demonstrate the issue we were facing.
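The comment above does not quote which two options were enabled. A minimal sketch, assuming (based on the encoding discussion elsewhere in this thread) that the flags in question are prefer_unicode and fetch_wchar_as_char; both the flag choice and the DSN name are assumptions:

```python
# Hedged sketch: the two make_options() flags are assumptions drawn from the
# rest of the thread, not quoted from the comment above. The import is guarded
# so the sketch also runs where turbodbc is not installed.
try:
    from turbodbc import make_options
    options = make_options(prefer_unicode=True,        # bind text as UTF-16
                           fetch_wchar_as_char=False)  # keep wide columns wide
except ImportError:
    options = None  # turbodbc not available in this environment
# connect(dsn="my_mssql_dsn", turbodbc_options=options)  # hypothetical DSN
```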
fetch_wchar_as_char means it will bind a narrow encoding as the C type even if the relational type indicates that the column uses a wide encoding. Speaking more of ODBC than turbodbc in particular here: you can either set fetch_wchar_as_char to False and prefer_unicode to True, or fetch_wchar_as_char to True and prefer_unicode to False. By default turbodbc sets both to False. This means turbodbc is asking the question "How is the text encoded in the database?"; however, the better question to ask is "What is the system encoding on the client system?". For Linux the answer is usually UTF-8, and for Windows it is usually a regional extended ASCII standard, which differs depending on the region. At the time of this writing, a UTF-8 locale for Windows 11 can be enabled, but it is considered an experimental beta.
This would make fetching everything as wide characters (which is always defined to be UTF-16) an approach that works on every platform, if only all the drivers stuck to the standard. In practice, Linux drivers tend to have poor implementations of wide encodings.
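A quick way to see the client-side answer to that question from Python (a small stdlib illustration, not turbodbc-specific):

```python
# Small illustration: inspect the client system's preferred text encoding.
# On Linux this typically reports UTF-8; on Windows, a regional code page
# such as cp1252, unless the experimental UTF-8 locale option is enabled.
import locale
import sys

encoding = locale.getpreferredencoding(False)
print(f"{sys.platform}: {encoding}")
```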
Hi. I'm using:
I'm getting this exception when I try to insert a string of length 1600+ into an nvarchar(max) column. I have already tried setting the limit_varchar_results_to_max parameter to True.
Thanks.