Open aachaemenes opened 2 weeks ago
This information could be pulled from the error log. It could maybe be done on demand with the Messaging feature, pulling the data directly from the monitored instance. I might look into it at some point. Until then, it would be possible to set up a custom collection/report for this.
Do you have a custom report already for this? It is kind of important.
I use this script for one week of failed logins: https://www.mssqltips.com/sqlservertip/4941/find-all-failed-sql-server-logins/. Parsing the log files can take a while; that's why it was only run on demand.
I meant adding it as a custom report. Again, I think it is a very valuable report that businesses care about a lot. Every morning I want to know which accounts are failing to connect to SQL Server.
Custom reports & custom collections allow for significant customization without writing a single line of C#, letting you create bespoke solutions using only T-SQL. I'm hoping that users will create things for themselves and ideally share them here for other users to benefit from.
That said, this could be a good example for users to take inspiration from.
The first thing you need is a proc to capture the failed logins on your monitored instances. The proc below gets the failed logins from the last day. I'm converting the LogDate to UTC because dates are expected to be stored in UTC and are then converted to local time in the app (the default timezone conversion could be disabled in the report if you wanted to store local time for some reason).
CREATE OR ALTER PROC dbo.FailedLogins
AS
SET NOCOUNT ON;

CREATE TABLE #ErrorLog (
    LogDate DATETIME,
    ProcessInfo NVARCHAR(50),
    Text NVARCHAR(MAX)
);

CREATE TABLE #enum (
    ArchiveNumber INT PRIMARY KEY,
    LogDate DATETIME,
    LogSize BIGINT
);

INSERT INTO #enum
EXEC sys.sp_enumerrorlogs;

DECLARE @ArchiveNumber INT;

/* Loop through the error log archives written to in the last day */
DECLARE c1 CURSOR FAST_FORWARD LOCAL FOR
    SELECT ArchiveNumber
    FROM #enum
    WHERE LogDate >= DATEADD(d, -1, GETDATE());

OPEN c1;
WHILE 1 = 1
BEGIN
    FETCH NEXT FROM c1 INTO @ArchiveNumber;
    IF @@FETCH_STATUS <> 0
        BREAK;
    INSERT INTO #ErrorLog
    EXEC sp_readerrorlog @ArchiveNumber, 1, 'Login failed';
END
CLOSE c1;
DEALLOCATE c1;

SELECT DATEADD(mi, DATEDIFF(mi, GETDATE(), GETUTCDATE()), LogDate) AS LogDate, /* Convert to UTC based on current offset */
    Text
FROM #ErrorLog
WHERE LogDate >= DATEADD(d, -1, GETDATE());
⚠️ Note: @R4PH1 makes a good point about the cost of this collection, which will vary from instance to instance. There are things you can do, like trace flag 3226, that will reduce the noise in your error log, which could help with performance.
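For reference, trace flag 3226 suppresses the successful-backup messages that often make up the bulk of the error log, so there is less text for sp_readerrorlog to scan. A minimal sketch of enabling it (assuming you're happy with the trade-off of losing backup-success entries in the log):

```sql
/* Suppress "BACKUP DATABASE successfully processed" messages in the error log.
   This enables the flag globally for the running instance only;
   add -T3226 as a startup parameter to make it persist across restarts. */
DBCC TRACEON (3226, -1);

/* Verify the flag is enabled */
DBCC TRACESTATUS (3226);
```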
Next you need to configure the custom collection in the service config tool. Select the stored procedure and ensure the collection name is set to FailedLogins. Set an appropriate collection frequency - anything less than 1 day should work with the collection proc. If you have the Messaging feature enabled, you can trigger the report to run on demand outside its scheduled collection.
The Get Script button will do a lot of the hard work for you in creating the items in the repository database. In this case we need to customize it slightly, as we don't want to store the full collection with a new snapshot date each time it runs. With a bit of customization we end up with this - including a basic custom report.
/*
----------------------------------------------------------
| ____ ____ _ ____ _ |
| | _ \ | __ ) / \ | _ \ __ _ ___ | |__ |
| | | | || _ \ / _ \ | | | | / _` |/ __|| '_ \ |
| | |_| || |_) |/ ___ \ | |_| || (_| |\__ \| | | | |
| |____/ |____//_/ \_\ |____/ \__,_||___/|_| |_| |
| |
| DBA Dash - Custom Collection Setup Script |
| Generated: 2024-10-15 15:39 |
| Version 3.12.0 |
----------------------------------------------------------
Script to setup FailedLogins custom collection
This script is designed to be used as a template to allow fast setup of custom data collections.
It is recommended that you review the script and make any changes required for your environment.
Run this script in your DBA Dash repository database
The collected data will be inserted into a table called UserData.FailedLogins with additional columns for InstanceID and SnapshotDate
The InstanceID column can be joined to dbo.Instances
!! WARNING: Consider the cost of running your custom data collection !!
*/
/***************************************************************************************************************************
* Set Retention *
***************************************************************************************************************************/
/*
How long should we keep the collected data?
Daily partitions will be created to make it efficient to clear out old data.
If retention is over 365 days, monthly partitions will be used instead
*/
DECLARE @RetentionDays INT = 60
EXEC dbo.DataRetention_Upd @SchemaName ='UserData', @TableName = 'FailedLogins',@RetentionDays=@RetentionDays, @Validate=0
/***************************************************************************************************************************
* Setup partitioning *
***************************************************************************************************************************/
/* Create partition function and scheme to make it efficient to clear out old data */
CREATE PARTITION FUNCTION [PF_UserData_FailedLogins](DATETIME2) AS RANGE RIGHT FOR VALUES()
CREATE PARTITION SCHEME [PS_UserData_FailedLogins] AS PARTITION [PF_UserData_FailedLogins] ALL TO([PRIMARY])
GO
/***************************************************************************************************************************
* Create table to store data *
***************************************************************************************************************************/
/* Create user defined type so we can pass the collected data to the stored procedure */
CREATE TYPE UserData.[FailedLogins] AS TABLE (
[LogDate] DATETIME2 NULL,
[Text] NVARCHAR(MAX) NULL
);
GO
/* Create table to store the collected data */
GO
CREATE TABLE UserData.[FailedLogins] (
[InstanceID] INT NOT NULL,
[LogDate] DATETIME2 NULL,
[Text] NVARCHAR(MAX) NULL,
/* Warning: the script just creates a clustered index on InstanceID and LogDate. Consider replacing this, adding a primary key and other indexes if required */
INDEX IX_UserData_FailedLogins CLUSTERED(InstanceID,LogDate)
) ON [PS_UserData_FailedLogins](LogDate)
WITH(DATA_COMPRESSION=PAGE);
GO
/***************************************************************************************************************************
* Create procedure for import *
***************************************************************************************************************************/
GO
CREATE OR ALTER PROCEDURE UserData.[FailedLogins_Upd]
(
/* InstanceID value from dbo.Instances table (Unique ID associated with the instance) */
@InstanceID INT,
/*
@SnapshotDate represents the time of the data collection in UTC.
It's the time all the collections within the same batch (scheduled time) started.
You can have your query return 'SYSUTCDATETIME() AS SnapshotDate' if you need higher accuracy
*/
@SnapshotDate DATETIME2,
/* Table-valued parameter with the contents of our custom data collection */
@FailedLogins [FailedLogins] READONLY
)
AS
SET XACT_ABORT ON
SET NOCOUNT ON
DECLARE @MaxLogDate DATETIME2
SELECT @MaxLogDate = ISNULL(MAX(LogDate),'19000101')
FROM UserData.[FailedLogins]
WHERE InstanceID = @InstanceID
INSERT INTO UserData.[FailedLogins]
(
[InstanceID],
[LogDate],
[Text]
)
SELECT
@InstanceID AS InstanceID,
[LogDate],
[Text]
FROM @FailedLogins
WHERE LogDate > @MaxLogDate
/* Log the data collection */
EXEC dbo.CollectionDates_Upd @InstanceID = @InstanceID,
@Reference = 'UserData.FailedLogins',
@SnapshotDate = @SnapshotDate
GO
/***************************************************************************************************************************
* Custom Reports *
***************************************************************************************************************************/
/* https://dbadash.com/docs/how-to/create-custom-reports/ */
GO
/*
Failed Logins
Custom report for DBA Dash.
http://dbadash.com
Generated: 2024-10-15 16:05:02
*/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE OR ALTER PROC [UserReport].[FailedLogins]
(
/* Table-valued parameter passing a list of InstanceIDs (dbo.Instance table) associated with the current context in DBA Dash */
@InstanceIDs IDs READONLY,
/* @FromDate/@ToDate parameters filter the report to a date range. UTC */
@FromDate DATETIME2,
@ToDate DATETIME2
)
AS
SELECT I.InstanceDisplayName AS Instance,
UD.[LogDate],
UD.[Text]
FROM UserData.FailedLogins UD
JOIN dbo.Instances I ON I.InstanceID = UD.InstanceID
WHERE EXISTS(SELECT 1
FROM @InstanceIDs T
WHERE T.ID = UD.InstanceID
)
AND UD.LogDate >= @FromDate
AND UD.LogDate < @ToDate
GO
/* Report customizations in GUI */
DELETE dbo.CustomReport
WHERE SchemaName = 'UserReport'
AND ProcedureName = 'FailedLogins'
INSERT INTO dbo.CustomReport(SchemaName,ProcedureName,MetaData)
VALUES('UserReport','FailedLogins','{
"ReportVisibilityRole": "public",
"ReportName": "Failed Logins",
"TriggerCollectionTypes": [
"UserData.FailedLogins"
],
"CustomReportResults": {
"0": {
"ColumnAlias": {
"LogDate": "Log Date"
},
"CellFormatString": {},
"DoNotConvertToLocalTimeZone": [],
"ColumnLayout": [],
"ResultName": "Result0",
"LinkColumns": {},
"CellHighlightingRules": {}
}
}
}')
/***************************************************************************************************************************
* Create initial partitions *
***************************************************************************************************************************/
EXEC dbo.Partitions_Add
/***************************************************************************************************************************
* Cleanup script to remove objects created *
****************************************************************************************************************************
IF EXISTS (SELECT * FROM sys.procedures WHERE Name = N'FailedLogins_Upd' AND schema_id = SCHEMA_ID('UserData'))
BEGIN
DROP PROC UserData.[FailedLogins_Upd]
END
IF EXISTS (SELECT * FROM sys.types WHERE is_user_defined = 1 AND name = N'FailedLogins' AND schema_id = SCHEMA_ID('UserData'))
BEGIN
DROP TYPE UserData.[FailedLogins]
END
IF EXISTS (SELECT * FROM sys.tables WHERE Name = N'FailedLogins' AND schema_id = SCHEMA_ID('UserData'))
BEGIN
DROP TABLE UserData.[FailedLogins]
END
IF EXISTS (SELECT * FROM sys.procedures WHERE Name = N'FailedLogins' AND schema_id = SCHEMA_ID('UserReport'))
BEGIN
DROP PROC [UserReport].[FailedLogins]
END
DELETE dbo.CustomReport WHERE SchemaName = 'UserReport' AND ProcedureName = 'FailedLogins'
IF EXISTS (SELECT * FROM sys.partition_schemes WHERE name = N'PS_UserData_FailedLogins')
BEGIN
DROP PARTITION SCHEME [PS_UserData_FailedLogins]
END
IF EXISTS (SELECT * FROM sys.partition_functions WHERE name = N'PF_UserData_FailedLogins')
BEGIN
DROP PARTITION FUNCTION [PF_UserData_FailedLogins]
END
DELETE dbo.DataRetention WHERE SchemaName = 'UserData' AND TableName = 'FailedLogins'
****************************************************************************************************************************/
Adding a failed login report to DBA Dash out-of-the-box reporting.