Closed — JakubKad closed this issue 1 year ago
I'm not really sure that you can avoid this type of error completely. There will always be a number of events that you will not be able to replay, especially at the very beginning of the replay. The ideal solution depends on what you are trying to achieve: are you comparing production with test, or test with test? What is the purpose of the replay? Depending on the answer, you could:
Hope this helps
Hello,
After a while we reached a conclusion. We used a marked transaction (BEGIN TRANSACTION ... WITH MARK) to get a known point in time. Using this, we restored the backup and logs to the MARK point and started the replay from the beginning (when the capture started). One quick question at the end: the small files left behind after the capture finished, do they contain data from the main TEMP file (and are therefore used to inject some small samples), or do they just verify the capture file (SQLite)?
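For reference, the marked-transaction approach described above can be sketched in T-SQL roughly like this (the database name, backup paths, mark name, and the `dbo.ReplayMarker` table are placeholders, not taken from this thread):

```sql
-- On the source server, just before starting the capture:
-- write a named mark into the transaction log.
BEGIN TRANSACTION CaptureStart WITH MARK 'Start of workload capture';
    -- the transaction must modify the database for the mark to be logged
    UPDATE dbo.ReplayMarker SET MarkedAt = GETDATE();
COMMIT TRANSACTION CaptureStart;

-- On the target server: restore the full backup plus logs,
-- stopping exactly at the mark so the data matches the capture start.
RESTORE DATABASE DB FROM DISK = N'C:\Backup\DB.bak'
    WITH NORECOVERY;
RESTORE LOG DB FROM DISK = N'C:\Backup\DB_log.trn'
    WITH STOPATMARK = 'CaptureStart', RECOVERY;
```

`STOPATMARK` recovers up to and including the marked transaction, which is what makes the restored database line up with the moment the capture began.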
The capture creates a big TEMP file; these cache files remain after the capture has finished.
Edit: I know there is a CACHE option in the JSON (we have not defined it), but I am not entirely sure of the purpose of these files. There were around 100 of them before we shut everything down for good, because the CMD window was no longer writing anything and only these CACHE files kept popping up (we used the timer option in the JSON).
Glad you sorted it out! Regarding the cache files, those are used to cache events to disk before processing them. The events queue is a memory-mapped file and needs the cache to avoid using all the memory on the host. I'm surprised those files don't get deleted, though; that is not the intended behavior. Thanks for reporting it, I'll have a look at the code.
Hello,
I am here again to ask for help. I have a problem with the replay. I will go through the steps I have taken, then describe the problem and the solution I have in mind.
Errors
2022-12-14 11:57:36.0902 - Info - WorkloadTools.Consumer.Replay.ReplayConsumer : 1301000 events queued for replay ( 28% )
2022-12-14 11:57:36.8018 - Info - WorkloadTools.Consumer.Replay.ReplayConsumer : 1302000 events queued for replay ( 28% )
2022-12-14 11:57:37.7756 - Info - WorkloadTools.Consumer.Replay.ReplayConsumer : 1303000 events queued for replay ( 28% )
2022-12-14 11:57:38.6316 - Info - WorkloadTools.Consumer.Replay.ReplayWorker : Worker [209] - 6000 commands executed.
2022-12-14 11:57:38.6316 - Info - WorkloadTools.Consumer.Replay.ReplayWorker : Worker [209] - 90000 commands pending.
2022-12-14 11:57:38.6350 - Info - WorkloadTools.Consumer.Replay.ReplayWorker : Worker [209] - Last Event Sequence: 176656
2022-12-14 11:57:38.6350 - Info - WorkloadTools.Consumer.Replay.ReplayWorker : Worker [209] - 205 commands per second.
2022-12-14 11:57:38.8016 - Warn - WorkloadTools.Consumer.Replay.ReplayWorker : Worker [84] - Sequence[383713] - Error: The INSERT statement conflicted with the FOREIGN KEY constraint "FK_RevenueRecognition_OrderItems". The conflict occurred in database "DB", table "dbo.OrderItems", column 'OrderItem_ID'. The statement has been terminated.
Description
The server is in continuous operation and cannot be put into a state where nothing is running on it; it cannot be shut down even for a while. I am able to get the database into a state identical to when the capture was running. But there is a problem here that I am unable to resolve. It is caused by the continuous stream of data to the server: when I start the capture, there are still queries running that have not yet completed and therefore cannot be recorded in the SQLite file.
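One way to at least see the scope of this problem is to check which requests are still in flight against the captured database just before starting the capture. A minimal sketch using the standard `sys.dm_exec_requests` DMV (the database name `DB` is a placeholder):

```sql
-- List requests still executing against the captured database.
-- Run this just before starting the capture: whatever it returns
-- are the in-flight statements that will be missing from the capture file.
SELECT r.session_id,
       r.start_time,
       r.status,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.database_id = DB_ID(N'DB')
  AND r.session_id <> @@SPID;  -- exclude this monitoring session
```

Polling this until it returns no rows does not stop new work from arriving, but it tells you when the previously running statements have drained.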
Because of this, I am not able to handle errors that may occur during the replay, as it may be working with data that is recorded neither in the capture nor in the backup/logs. How should I address this issue? I was thinking of using a larger time window, so that I can be sure all queries have finished, and then going through the SQLite file and removing the queries that have already been executed and committed to the DB (a very inconvenient and time-consuming way of solving it).
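If you do end up pruning the capture file, the SQLite file can be edited directly with the `sqlite3` CLI. The table and column names below are assumptions for illustration only; inspect the real schema of your capture file first with `sqlite3 capture.sqlite '.schema'` and adjust accordingly:

```sql
-- Hypothetical cleanup of the capture file (schema names are assumed).
-- Remove events captured before the restore point, i.e. statements
-- that are already reflected in the restored database.
DELETE FROM Events
WHERE event_sequence < 176656;  -- example: first event sequence to keep
VACUUM;  -- reclaim the space freed by the delete
```

Always work on a copy of the capture file, since a bad delete cannot be undone.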
When I tried this tool on a local instance to get familiar with it, I didn't have these problems, of course, and everything worked fine.
Thanks for any advice.