Open pattisapu01 opened 5 years ago
@teeboy75 thanks for submitting this issue
First, it was not clear in the documentation, but the `logLevel` property of the Log4net input indicates the minimum level for captured events. So if you want to capture all events at all levels, you need to declare the Log4net input just once, with `logLevel` equal to `Debug`, as it is the lowest possible level. I have just updated the docs to clarify this.
Also, I would suggest using StdOutput to determine whether the problem is that the Log4net input is not capturing the NDC context data, or whether it is a problem with the Elasticsearch output. Can you please do another experiment and:

- remove the extra Log4net inputs, keeping a single one with `logLevel` set to `Debug`
- add a `StdOutput` output to the pipeline

and let us know if that causes NDC data to show up? Thx!
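For example, a trimmed-down `eventflowconfig.json` for this experiment could look like the following (a minimal sketch; the `StdOutput` output just writes each event to the console):

```json
{
  "inputs": [
    { "type": "Log4net", "logLevel": "Debug" }
  ],
  "outputs": [
    { "type": "StdOutput" }
  ]
}
```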
Added StdOutput and removed the additional log levels from eventflowconfig. It does not make any difference: NDC output does not show up in the console window or in ELK. The only additional JSON-enriched fields that seem to be sent to Elasticsearch are "timestamp", "providerName", "level", and "keywords". Of course, the "payload" element contains just the text "Hey! Listen!". Here is the JSON:

```json
{
  "_index": "index-2018.11.15",
  "_type": "diagData",
  "_id": "P0PeGGcBITRnpnnqT_iE",
  "_version": 1,
  "_score": null,
  "_source": {
    "timestamp": "2018-11-15T14:34:38.603181-05:00",
    "providerName": "Log4netInput.MY_LOGGER_NAME",
    "level": 5,
    "keywords": 0,
    "payload": {
      "Message": "Hey! Listen!",
      "Exception": {
        "ClassName": "System.Exception",
        "Message": "uhoh",
        "Data": null,
        "InnerException": null,
        "HelpURL": null,
        "StackTraceString": null,
        "RemoteStackTraceString": null,
        "RemoteStackIndex": 0,
        "ExceptionMethod": null,
        "HResult": -2146233088,
        "Source": null,
        "WatsonBuckets": null
      }
    }
  },
  "fields": {
    "timestamp": [
      "2018-11-15T19:34:38.603Z"
    ]
  },
  "sort": [
    1542310478603
  ]
}
```
Just to be clear: if I remove the EventFlow input and output and use log4net's log appenders directly, the NDC data does show up.
OK, thanks. I looked a bit more closely into this, and it looks like appending data from the Log4net global context and logical thread context was never implemented.

For comparison, here is how the Stackify Retrace appender does this.
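A rough sketch of what the fix could look like when the input converts a log4net `LoggingEvent` (the helper and the `payload` dictionary are illustrative, not the actual EventFlow internals):

```csharp
// Sketch only: lift log4net context data into the event payload during
// conversion of a LoggingEvent. Hypothetical helper, not EventFlow's code.
using System.Collections.Generic;
using log4net.Core;
using log4net.Util;

static class ContextEnricher
{
    public static void AddContextProperties(LoggingEvent loggingEvent, IDictionary<string, object> payload)
    {
        // GetProperties() returns the composite of event-, thread-, logical-thread-,
        // and global-context properties, including the "NDC" stack pushed via
        // NDC.Push / ThreadContext.Stacks["NDC"].
        PropertiesDictionary properties = loggingEvent.GetProperties();
        foreach (string key in properties.GetKeys())
        {
            object value = properties[key];

            // Context stacks are stored as stack objects; ToString() renders them
            // the way the %ndc pattern token does.
            if (value is ThreadContextStack || value is LogicalThreadContextStack)
            {
                value = value.ToString();
            }

            if (value != null)
            {
                payload[key] = value;
            }
        }
    }
}
```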
@teeboy75 would you be willing to submit a PR to have these properties added?
CC @jeremysmith1
Sure! I can do it late next week.
Thank you! I much appreciate your help, @teeboy75.
@karolz-ms I have the code ready. Waiting for corporate to give me permission for PR submission. Will update you.
@teeboy75 Roger!
@pattisapu01 Do you mind if we (myself or some other Microsoft developer) "steal" the code from you (https://github.com/pattisapu01/diagnostics-eventflow/commit/c9c4d65cc20cd2ce61724413f35cb9c0de6f3741) and finish this PR? I'd be happy if your time and work on this issue did not go to waste...
Absolutely! Unfortunately, the company I work for is not moving quickly. Thank you for getting back to me. Regards, Prakash
I have a simple EventFlow setup writing to Elasticsearch with the Log4net input.

eventflowconfig.json:
"inputs": [
{ "type": "Log4net", "logLevel": "Debug" }, { "type": "Log4net", "logLevel": "Info" }, { "type": "Log4net", "logLevel": "Warn" }, { "type": "Log4net", "logLevel": "Error" }, { "type": "Log4net", "logLevel": "Fatal" } ],
"outputs": [ { "type": "ElasticSearch", "indexNamePrefix": "defaultindex", "serviceUri": "http://servername:9200", "basicAuthenticationUserName": "user", "basicAuthenticationUserPassword": "password", "eventDocumentTypeName": "diagData", "numberOfShards": 1, "numberOfReplicas": 0, "refreshInterval": "5s"
}
], "settings": { "pipelineBufferSize": "1000", "maxEventBatchSize": "100", "maxBatchDelayMsec": "500", "maxConcurrency": "8", "pipelineCompletionTimeoutMsec": "30000" }, "extensions": [] }
The issue is that the custom data ("12345" in this case) I write to the NDC context is not written to Elasticsearch. How do I configure the Elasticsearch output to recognize any custom data pushed to NDC? The debug message, along with the exception message, does get written to ELK.
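Here is a minimal sketch of how the custom data gets pushed around the logging call (class and member names are illustrative; the message and exception match the JSON above):

```csharp
using System;
using log4net;

public class Worker
{
    private static readonly ILog log = LogManager.GetLogger(typeof(Worker));

    public void DoWork()
    {
        // Push the custom value onto the NDC stack for the duration of the
        // using block; the classic equivalent is log4net.NDC.Push("12345").
        using (ThreadContext.Stacks["NDC"].Push("12345"))
        {
            log.Debug("Hey! Listen!", new Exception("uhoh"));
        }
    }
}
```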