Closed: ElenaShlykova closed this issue 1 year ago.
Hello @ElenaShlykova, Thank you for reaching out to ZZZ Projects. Due to the holiday vacation, we will have limited email access from December 23 to January 2, which may cause some delays. We are sorry for any inconvenience.
Hello @ElenaShlykova,
The current error looks to be caused by a timeout.
Approximately how long does it take without this nvarchar(max) column, or with this column but without the auditing? And do you know if there is something, such as a trigger, that could explain why it takes so long?
Is it possible for you to reproduce it in a standalone project that contains the minimum code so we can try it? We already have multiple unit tests with an nvarchar(max) column, so there is surely something we are missing here. You can send the project in private here if needed: info@zzzprojects.com
Best Regards,
Jon
It looks like the problem is in the generated SQL. I captured the SQL generated by Dapper Plus using SQL Profiler and got the same error while running the script in Microsoft SQL Server Management Studio. This is the SQL:
exec sp_executesql N'MERGE INTO [mytable] AS DestinationTable
USING
(
SELECT TOP 100 PERCENT * FROM (SELECT @0_0 AS [id], @0_1 AS ZZZ_Index) AS StagingTable ORDER BY ZZZ_Index
) AS StagingTable
ON DestinationTable.[id] = StagingTable.[id]
WHEN MATCHED THEN
DELETE
OUTPUT
$action,
StagingTable.ZZZ_Index,
DELETED.[payload] AS [payload_zzzdeleted]
;',N'@0_0 bigint,@0_1 nvarchar(19),@0_2 int',@0_0=20466,@0_1=0
This looks like a SQL Server limitation: it throws an error when a huge column is included in the OUTPUT clause. It works once I remove the payload column from the output.
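The observation above can be illustrated by trimming the OUTPUT clause of the captured statement. This is a hypothetical reduction of the profiled script (with the parameters inlined as literals), not SQL emitted by Dapper Plus itself:

```sql
-- Same MERGE shape as captured by SQL Profiler, but with the large
-- [payload] column removed from the OUTPUT clause. With this version
-- the transport-level error no longer occurs.
MERGE INTO [mytable] AS DestinationTable
USING
(
    SELECT TOP 100 PERCENT *
    FROM (SELECT 20466 AS [id], 0 AS ZZZ_Index) AS StagingTable
    ORDER BY ZZZ_Index
) AS StagingTable
ON DestinationTable.[id] = StagingTable.[id]
WHEN MATCHED THEN
    DELETE
OUTPUT
    $action,
    StagingTable.ZZZ_Index;
```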
I also found a workaround with Dapper Plus. Everything works if I set ForceSelectOutput = true for the bulk operation. In this case, a temporary table is used to retrieve the old values. Please let me know if this is the best solution.
Hello @ElenaShlykova,
Thank you for the additional information. We will look more into it and into why it happens.
In the meantime, ForceSelectOutput is certainly a good solution. As you have probably already seen, it will OUTPUT the values into a table variable before selecting them afterward.
I will try to give you an update very soon.
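The difference between the default direct OUTPUT and the behavior described above can be sketched roughly as follows. This is a simplified illustration with placeholder table and column names, not the exact SQL Dapper Plus generates:

```sql
-- ForceSelectOutput-style pattern: instead of streaming the large
-- column back directly from the OUTPUT clause, the MERGE writes its
-- output into a table variable first, and the values are selected
-- from it afterward.
DECLARE @ZZZ_Output TABLE
(
    [action] nvarchar(10),
    [payload_zzzdeleted] nvarchar(max)
);

MERGE INTO [mytable] AS DestinationTable
USING (SELECT CAST(20466 AS bigint) AS [id]) AS StagingTable
    ON DestinationTable.[id] = StagingTable.[id]
WHEN MATCHED THEN
    DELETE
OUTPUT $action, DELETED.[payload]
    INTO @ZZZ_Output ([action], [payload_zzzdeleted]);

-- Separate SELECT returns the audited values to the client.
SELECT [action], [payload_zzzdeleted] FROM @ZZZ_Output;
```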
Hello @ElenaShlykova,
We are probably missing something, but my developer was not able to reproduce it. We tested many different scenarios with very large content, and outputting directly or into a table gave pretty much the same performance (mostly a 1-3% difference).
Do you think you could create a runnable project or provide a SQL script that reproduces this issue? It doesn't need to be your project, just a new solution with the minimum code to reproduce the issue. You can send it in private here: info@zzzprojects.com
Hello @ElenaShlykova,
Unfortunately, since we didn't hear from you, I will close this issue.
As previously mentioned, we need a runnable project to be able to assist you.
We will reopen the issue if a project is received.
Feel free to contact us for questions, issues or feedback.
Best Regards,
Jon
Description
I have a table with several columns, one of which has the nvarchar(max) type. We also need to audit all changes in this table. Everything works fine until I put a large value (around 4,000 characters) into the nvarchar(max) column. In that case, I get the following error:
Exception
Msg 10054, Level 20, State 0, Line 0 A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)
Further technical details