spaghettidba / WorkloadTools

A collection of tools to collect, analyze and replay SQL Server workloads, on premises and in the cloud
MIT License

Violation of PRIMARY KEY constraint 'PK_WorkloadDetails' #133

Open regis671 opened 1 year ago

regis671 commented 1 year ago

Hello, when analyzing an extended events capture converted into a SQLite file, I get this error:

    Info - WorkloadTools.Consumer.Analysis.WorkloadAnalyzer : Summary info written (15 rows)
    Warn - WorkloadTools.Consumer.Analysis.WorkloadAnalyzer : Unable to write workload analysis.
    Warn - WorkloadTools.Consumer.Analysis.WorkloadAnalyzer : Violation of PRIMARY KEY constraint 'PK_WorkloadDetails'. Cannot insert duplicate key in object 'Prod_1007.WorkloadDetails'. The duplicate key value is (742302267, -4258734915548242222, 0, 0, 6, 0). The statement has been terminated.
    Error - WorkloadTools.Consumer.Analysis.WorkloadAnalyzer : Unable to write workload analysis info to the destination database.
    Error - WorkloadTools.Consumer.Analysis.WorkloadAnalyzer :
       at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
       at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
       at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
       at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
       at System.Data.SqlClient.SqlBulkCopy.RunParser(BulkCopySimpleResultSet bulkCopyHandler)
       at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinuedOnSuccess(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
       at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinued(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
       at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsync(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestAsync(CancellationToken cts, TaskCompletionSource`1 source)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalAsync(CancellationToken ctoken)
       at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
       at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DbDataReader reader)
       at WorkloadTools.Consumer.Analysis.WorkloadAnalyzer.WriteExecutionDetails(SqlConnection conn, SqlTransaction tran, Int32 current_interval_id) in C:\github\WorkloadTools\WorkloadTools\Consumer\Analysis\WorkloadAnalyzer.cs:line 760
       at WorkloadTools.Consumer.Analysis.WorkloadAnalyzer.WriteToServer(DateTime intervalTime) in C:\github\WorkloadTools\WorkloadTools\Consumer\Analysis\WorkloadAnalyzer.cs:line 443
       at WorkloadTools.Consumer.Analysis.WorkloadAnalyzer.CloseInterval() in C:\github\WorkloadTools\WorkloadTools\Consumer\Analysis\WorkloadAnalyzer.cs:line 103

Here is the config JSON:

"Controller": {

    "Listener":
    {
        "__type": "FileWorkloadListener",
        "Source": "d:\\sqlite\\workload1.sqlite"
    },

    "Consumers":
    [
        {
            "__type": "AnalysisConsumer",
            "ConnectionInfo": 
            {
                "ServerName": "1.1.1.1",
                "DatabaseName": "replay_analysis_20G",
                "SchemaName": "Prod_1007",
                "UserName": "sa",
                "Password": "********"
            }
        }
    ]
}

}
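
For context, the .sqlite file referenced by "Source" is typically produced by an earlier capture run of SqlWorkload. Below is a minimal sketch of what such a capture configuration might look like, assuming the "ExtendedEventsWorkloadListener" and "WorkloadFileWriterConsumer" types described in the project documentation; the server name is a placeholder:

    {
        "Controller": {

            "Listener":
            {
                "__type": "ExtendedEventsWorkloadListener",
                "ConnectionInfo":
                {
                    "ServerName": "SOURCESRV"
                }
            },

            "Consumers":
            [
                {
                    "__type": "WorkloadFileWriterConsumer",
                    "OutputFile": "d:\\sqlite\\workload1.sqlite"
                }
            ]
        }
    }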

8trackbattlecat commented 1 year ago

I was able to resolve this duplicate key error by adding the "UploadIntervalSeconds" property to the Consumer.

Example Consumers JSON snippet with the fix:

    "Consumers":
    [
        {
            "__type": "AnalysisConsumer",
            "ConnectionInfo": 
            {
                "ServerName": "1.1.1.1",
                "DatabaseName": "replay_analysis_20G",
                "SchemaName": "Prod_1007",
                "UserName": "sa",
                "Password": "********"
            },
            "UploadIntervalSeconds": 60
        }
    ]
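
For completeness, merging this property into the original configuration from the issue would presumably give something like the following sketch (60 seconds is just the example value above; as I understand it, this setting controls how often the AnalysisConsumer writes a completed interval to the analysis database):

    {
        "Controller": {

            "Listener":
            {
                "__type": "FileWorkloadListener",
                "Source": "d:\\sqlite\\workload1.sqlite"
            },

            "Consumers":
            [
                {
                    "__type": "AnalysisConsumer",
                    "ConnectionInfo":
                    {
                        "ServerName": "1.1.1.1",
                        "DatabaseName": "replay_analysis_20G",
                        "SchemaName": "Prod_1007",
                        "UserName": "sa",
                        "Password": "********"
                    },
                    "UploadIntervalSeconds": 60
                }
            ]
        }
    }
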
spaghettidba commented 1 year ago

Thanks for reporting this. I'll have a look at the code.