aachaemenes opened 2 weeks ago
It's worth noting that the running query snapshots don't aggregate as such over time. They represent a point-in-time snapshot of what was running when the collection ran. In the next collection, some queries from the previous snapshot might still be running and be double-counted. Countless queries will have run and completed in between collections and won't show up at all. The snapshots are particularly useful because they show what was occurring at a point in time - great for identifying things like tempdb contention, blocking, etc.
Some insight can be gained by querying across the collected snapshots as you suggested. The summary of the collected snapshots is useful but might not always provide what is needed. I typically query the repository database directly in such situations, but there may be a need to have something built in.
If you have something specific in mind, you could always create a custom report. The dbo.RunningQueriesInfo view will do a lot of the heavy lifting.
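As a sketch of what querying the repository directly can look like: the query below summarizes activity across snapshots for a time window, using the same dbo.RunningQueriesInfo view and columns that appear in the custom report further down. The @InstanceID value of 1 is a placeholder - substitute the ID of the instance you are investigating.

```sql
/* Sketch: summarize sessions observed across running-query snapshots.
   Assumes the DBA Dash repository database and the dbo.RunningQueriesInfo
   view with the columns used by the custom report below. */
DECLARE @InstanceID INT = 1;                        /* placeholder instance ID */
DECLARE @ToDate DATETIME2 = SYSUTCDATETIME();
DECLARE @FromDate DATETIME2 = DATEADD(HOUR, -1, @ToDate);

SELECT session_id,
       login_name,
       COUNT(*)            AS SnapshotCount,        /* snapshots the session appeared in */
       MIN(SnapshotDateUTC) AS FirstSeenUTC,
       MAX(SnapshotDateUTC) AS LastSeenUTC,
       MAX(cpu_time)        AS MaxCpuTime,
       MAX(logical_reads)   AS MaxLogicalReads
FROM dbo.RunningQueriesInfo
WHERE InstanceID = @InstanceID
  AND SnapshotDateUTC >= @FromDate
  AND SnapshotDateUTC < @ToDate
GROUP BY session_id, login_name
ORDER BY SnapshotCount DESC;
```

Note that a session appearing in many snapshots may be one long-running query or several short ones reusing the same session; join on start_time_utc or query_hash if you need to distinguish them.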
Let's say the security team wants a report for the past hour for some forensic work. Instead of sending them 60 files, I could send them one file with all the transactions via a custom report.
DBA Dash isn't a security tool. The running queries snapshot captures whatever queries happen to be in flight at the time the snapshot is taken. Depending on what you are looking for, it might be able to provide answers in certain situations, but it's not suitable as an auditing tool.
You can create a custom report to do this if you want to dump all the data from a particular period. Here is an example that only took a couple of minutes to put together - faster than exporting 60 files. More time could be spent on formatting, adding extra parameters to filter on, etc. It's a starting point that you can customize to your requirements. The report works on a single instance, so it will be available under the reports node of an individual instance.
/*
Running Queries
Custom report for DBA Dash.
http://dbadash.com
Generated: 2024-10-13 14:50:50
*/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE OR ALTER PROC [UserReport].[RunningQueries](
@InstanceID INT,
@FromDate DATETIME2,
@ToDate DATETIME2
)
AS
SELECT SnapshotDateUTC AS SnapshotDate,
Duration,
batch_text,
text,
query_plan,
object_id,
object_name,
session_id,
command,
status,
wait_time,
wait_type,
TopSessionWaits,
blocking_session_id,
BlockingHierarchy,
BlockCountRecursive,
BlockWaitTimeRecursiveMs,
BlockWaitTimeRecursive,
BlockCount,
IsRootBlocker,
BlockWaitTimeMs,
BlockWaitTime,
cpu_time,
logical_reads,
reads,
writes,
granted_query_memory_kb,
percent_complete,
open_transaction_count,
transaction_isolation_level,
login_name,
host_name,
database_id,
database_name,
database_names,
program_name,
job_id,
job_name,
client_interface_name,
start_time_utc,
last_request_start_time_utc,
last_request_end_time_utc,
last_request_duration,
sleeping_session_idle_time_sec,
sleeping_session_idle_time,
sql_handle,
plan_handle,
query_hash,
query_plan_hash,
[Duration (ms)],
wait_resource,
wait_resource_type,
wait_database_id,
wait_file_id,
wait_page_id,
wait_object_id,
wait_index_id,
wait_hobt,
wait_hash,
wait_slot,
wait_is_compile,
page_type,
wait_db,
wait_object,
wait_file,
login_time_utc,
has_plan,
statement_start_offset,
statement_end_offset
FROM dbo.RunningQueriesInfo
WHERE InstanceID = @InstanceID
AND SnapshotDateUTC >= @FromDate
AND SnapshotDateUTC < @ToDate
GO
/* Report customizations in GUI */
DELETE dbo.CustomReport
WHERE SchemaName = 'UserReport'
AND ProcedureName = 'RunningQueries'
INSERT INTO dbo.CustomReport(SchemaName,ProcedureName,MetaData)
VALUES('UserReport','RunningQueries','{
"ReportName": "Running Queries",
"TriggerCollectionTypes": [],
"CustomReportResults": {
"0": {
"ColumnAlias": {},
"CellFormatString": {},
"DoNotConvertToLocalTimeZone": [],
"ColumnLayout": [],
"ResultName": "Result0",
"LinkColumns": {
"batch_text": {
"$type": "TextLinkColumnInfo",
"TargetColumn": "batch_text",
"TextHandling": "SQL"
},
"text": {
"$type": "TextLinkColumnInfo",
"TargetColumn": "text",
"TextHandling": "SQL"
},
"query_plan": {
"$type": "QueryPlanLinkColumnInfo",
"TargetColumn": "query_plan"
}
},
"CellHighlightingRules": {
"blocking_session_id": {
"Rules": [
{
"Status": 1,
"Value1": "0",
"ConditionType": "GreaterThan"
},
{
"Status": 4,
"Value1": "0"
}
],
"TargetColumn": "blocking_session_id"
}
}
}
}
}')
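Once deployed to the repository database, the report appears in the GUI, but the procedure can also be invoked directly, e.g. for the past hour. The @InstanceID value here is a placeholder - use the ID of the instance you want the report for.

```sql
/* Sketch: run the custom report proc directly for the last hour.
   @InstanceID = 1 is a placeholder - substitute the real instance ID
   from the DBA Dash repository. */
DECLARE @Now DATETIME2 = SYSUTCDATETIME();

EXEC [UserReport].[RunningQueries]
     @InstanceID = 1,
     @FromDate   = DATEADD(HOUR, -1, @Now),
     @ToDate     = @Now;
```

The result set can then be saved to a single file (e.g. via the GUI's export, or SSMS "Results to File") rather than exporting each snapshot separately.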
This is great. Having this as a report in the dash could be really helpful in the future.
Can we add a feature so that, when doing forensic work with the minute-to-minute transaction list, we can export all the transactions into one Excel file? Right now I have to export 60 files for a one-hour time frame. It would be nice to have it as one big file. Thanks.