Closed hjb417 closed 2 years ago
When parameters are used in a SQL statement, the SQL is effectively executed in its own transaction, so the temporary table is dropped immediately afterwards. You could try using global temporary tables instead (e.g. ##SomeTable). Or just create the local temp table in its own non-parameterized SQL, e.g. as a replacement for your third example:
conn = pyodbc.connect(conn_str)
try:
    cursor = conn.cursor()
    cursor.execute("CREATE TABLE #SomeTable2 (SomeCol INT NOT NULL)")
    drain_cursor(cursor)
    cursor.execute("INSERT INTO #SomeTable2 VALUES (?)", [3])
    drain_cursor(cursor)
    cursor.execute("SELECT * FROM #SomeTable2")
    drain_cursor(cursor)
    print("WORKS")
except Exception as e:
    print(e)
    print("FAIL")
finally:
    conn.close()
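For completeness, here is a hedged sketch of the other workaround mentioned above, the global temp table (##) variant. It is wrapped in a function so it can be defined without a live SQL Server; `conn_str` is a placeholder connection string and `drain_cursor` is the helper from the original report, passed in here as an argument.

```python
def demo_global_temp(conn_str, drain_cursor):
    """Sketch of the global temp table (##) workaround; assumes pyodbc
    is installed and conn_str points at a SQL Server instance."""
    import pyodbc  # deferred so the sketch can be defined without the driver

    conn = pyodbc.connect(conn_str)
    try:
        cursor = conn.cursor()
        # A global temp table (##) stays visible to the later statements
        # even when the statements around it are parameterized.
        cursor.execute("CREATE TABLE ##SomeTable (SomeCol INT NOT NULL)")
        drain_cursor(cursor)
        cursor.execute("INSERT INTO ##SomeTable VALUES (?)", [3])
        drain_cursor(cursor)
        cursor.execute("SELECT * FROM ##SomeTable")
        drain_cursor(cursor)
        # Unlike a local temp table, a global one is visible to other
        # sessions and lives until every session using it disconnects,
        # so drop it explicitly when done.
        cursor.execute("DROP TABLE ##SomeTable")
        drain_cursor(cursor)
    finally:
        conn.close()
```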
All the executes are part of the same transaction because autocommit was implicitly set to False. Had it been set to True, all test cases would have failed.
But regardless, this is a bug, most likely in pyodbc, that hopefully the author or someone else can fix.
This is the normal behaviour as Keith explains above.
Where is this documented? The only difference I can see is that pyodbc uses SQLExecute when parameters are passed in and SQLExecDirect when they aren't. And the documentation for both says "If the data source is in manual-commit mode (requiring explicit transaction initiation) and a transaction has not already been initiated, the driver initiates a transaction before it sends the SQL statement."
Environment
To diagnose, we usually need to know the following, including version numbers. On Windows, be sure to specify 32-bit Python or 64-bit:
Issue
When creating temp tables, I can no longer access them if the cursor.execute call that created them was passed parameters.
Of the three test cases below, the first passes and the other two fail.
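The three test cases themselves are not reproduced here. As I understand the description, the failing pattern looks roughly like the following hedged sketch (names and the combined statement are illustrative, not the reporter's exact code; it assumes pyodbc and a SQL Server instance):

```python
def demo_failing_pattern(conn_str):
    """Illustrative sketch of the reported failure: a parameterized
    statement creates a local temp table, and the table is gone by the
    time the next statement runs."""
    import pyodbc  # deferred so the sketch can be defined without the driver

    conn = pyodbc.connect(conn_str)
    try:
        cursor = conn.cursor()
        # Because a parameter is passed, pyodbc executes this via
        # SQLExecute rather than SQLExecDirect, and the local temp
        # table does not survive the statement:
        cursor.execute(
            "CREATE TABLE #SomeTable (SomeCol INT NOT NULL);"
            " INSERT INTO #SomeTable VALUES (?)",
            [3],
        )
        # This SELECT then fails; SQL Server reports #SomeTable as an
        # invalid object name.
        cursor.execute("SELECT * FROM #SomeTable")
    finally:
        conn.close()
```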