billgraziano / xelogstash

Send SQL Server Extended Events to Logstash, Elastic Search, or JSON

No query logging? #92

Closed fribse closed 1 year ago

fribse commented 1 year ago

Hi @billgraziano I spent yesterday getting this working, and now I'm getting data. But the two interesting parts, rpc_completed and sql_batch_completed, don't show up, and those are the ones I need. Any input on how to solve that?

I have this session that drops them into a file, but that takes up a lot of space, and I have no idea how to get it from that file to xelogstash:


CREATE EVENT SESSION [AuditLog] ON SERVER

ADD EVENT sqlserver.rpc_completed(SET collect_statement=(1)
ACTION(sqlserver.database_name,sqlserver.sql_text,sqlserver.username)),
ADD EVENT sqlserver.sql_batch_completed(
ACTION(sqlserver.database_name,sqlserver.sql_text,sqlserver.username))
--Set location and names for where the files should be placed on the local server. Should NOT be on the drive where the database files are!
ADD TARGET package0.event_file(SET filename= N'F:\AUDIT\sqlserver.xel' ,max_file_size=(1024),max_rollover_files=(10))
WITH (MAX_MEMORY=4096 KB,EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS,MAX_DISPATCH_LATENCY=30 SECONDS,MAX_EVENT_SIZE=0 KB,MEMORY_PARTITION_MODE=NONE,TRACK_CAUSALITY=ON,STARTUP_STATE=ON)
GO

--And finally run this command
ALTER EVENT SESSION AuditLog ON SERVER STATE = START;
GO
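
As a quick sanity check (not part of the original script), you can confirm the session is both defined and running before pointing xelogstash at it. This is a standard query against the Extended Events catalog and DMVs:

```sql
-- Confirm the AuditLog session exists and is running.
-- sys.server_event_sessions lists defined sessions; sys.dm_xe_sessions lists active ones.
SELECT s.name,
       CASE WHEN r.name IS NULL THEN 'stopped' ELSE 'running' END AS state
FROM sys.server_event_sessions AS s
LEFT JOIN sys.dm_xe_sessions AS r ON r.name = s.name
WHERE s.name = 'AuditLog';
```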
fribse commented 1 year ago

Ok, I got the statements sent to Elastic by adding AuditLog to the 'sessions' setting. I hope it can handle the rollover files?
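
For anyone following along, a minimal sketch of what that change might look like in the xelogstash TOML config — the exact key names and layout are assumptions based on the project's sample config, so check the README for your version:

```toml
# Hypothetical sketch -- verify key names against the xelogstash sample config.
[defaults]
# Sessions to read from every source; AuditLog is the session created above.
sessions = ["system_health", "AuditLog"]

[[source]]
fqdn = "MYSERVER.example.com"   # placeholder server name
```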

billgraziano commented 1 year ago

It should handle rollovers just fine. I've been using it this way for years. Let me know if you have any issues.

fribse commented 1 year ago

Ok, great, it looks like it's working. I'm trying to create a template for Elasticsearch so I can attach a lifecycle policy to it, but I can't get Kibana to accept the template. I've read the mappings out from the log, but it fails. The mappings look like this:

{
  "mappings": {
    "SQLExtendedLogs": {
      "properties": {
        "@timestamp": {
          "type": "date"
        },
        "@version": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "attach_activity_id": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "attach_activity_id_xfer": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "batch_text": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "callstack": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "category": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
...
        "statement": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
...
        "xml_report": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        }
      }
    }
  }
}

But when I copy it into a template, it fails. I see this error, and I'm not that experienced with complex templates:

{
  "statusCode": 400,
  "error": "Bad Request",
  "message": "composable template [simulate_template_vbccofdyt1meytx4d3sfrq] template after composition is invalid",
  "attributes": {
    "error": {
      "root_cause": [
        {
          "type": "illegal_argument_exception",
          "reason": "composable template [simulate_template_vbccofdyt1meytx4d3sfrq] template after composition is invalid"
        }
      ],
      "type": "illegal_argument_exception",
      "reason": "composable template [simulate_template_vbccofdyt1meytx4d3sfrq] template after composition is invalid",
      "caused_by": {
        "type": "illegal_argument_exception",
        "reason": "invalid composite mappings for [simulate_template_vbccofdyt1meytx4d3sfrq]",
        "caused_by": {
          "type": "mapper_parsing_exception",
          "reason": "Failed to parse mapping [_doc]: Root mapping definition has unsupported parameters:  [SQLExtendedLogs : {dynamic_templates=[], properties={data={type=text, fields={keyword={ignore_above=256, type=keyword}}}, xe_category={type=text, fields={keyword={ignore_above=256, type=keyword}}}, logical_reads={type=long}, opcode={type=text, fields={keyword={ignore_above=256, type=keyword}}}, type={type=text, fields={keyword={ignore_above=256, type=keyword}}}, mssql_domain={type=text, fields={keyword={ignore_above=256, type=keyword}}}, row_count={type=long}, user_defined={type=boolean}, xml_deadlock_report={type=text, fields={keyword=
...

Any idea on how to fix that?

fribse commented 1 year ago

I got it fixed after tinkering with it for a day :-) At the top I removed

  "mappings": {
    "SQLExtendedLogs": {

added `"dynamic_templates": []` above `"properties"`, and removed the corresponding curly brackets at the bottom. I also added a setting for replicas. Now it gets created. Now, how do I make it re-read all the existing log files so the template gets applied? If I just remove the 'mark' files, it doesn't seem to read all the old (rolled-over) log files, does it?
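
For reference, a minimal sketch of what the corrected composable template ends up looking like: custom mapping type names like `SQLExtendedLogs` were removed in Elasticsearch 7, so the mapping body has to sit directly under `mappings`. The index pattern, policy name, replica count, and the trimmed-down field list here are placeholders, not the full mapping from above:

```json
{
  "index_patterns": ["sqlxe-*"],
  "template": {
    "settings": {
      "number_of_replicas": 1,
      "index.lifecycle.name": "my-ilm-policy"
    },
    "mappings": {
      "dynamic_templates": [],
      "properties": {
        "@timestamp": { "type": "date" },
        "statement": {
          "type": "text",
          "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
        }
      }
    }
  }
}
```

This body is what you'd send with `PUT _index_template/<name>` from Kibana Dev Tools.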

fribse commented 1 year ago

Damn, when I create the template it looks like it's rejecting the input, and I only get a 452-byte index. As soon as I remove the template, the data flows...

billgraziano commented 1 year ago

I'm afraid my skills inside ELK or Logstash are very limited. I don't have any idea on these.