Closed PrabuddhaRaj closed 4 years ago
Hi!
I think there may be some confusion on the source of those logs.
Can you confirm the sourcetype of the event you are trying to exclude?
By the way, you should delete the " (to exclude kube-system namespace)" portion. Regarding the annotation, you can add it to your deployment.yaml file.
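If the chart supports excluding workloads by annotation, the deployment.yaml change might look like the sketch below. The annotation key `splunk.com/exclude` is an assumption here; verify it against the docs for the chart version you are running.

```yaml
# Hypothetical sketch: annotate the pod template so the collector skips it.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: marketing-mock
spec:
  template:
    metadata:
      annotations:
        # assumed annotation key; check your chart's documentation
        splunk.com/exclude: "true"
```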
Thanks @rockb1017. Actually, I resolved that issue by removing `@include source.files.conf` from the configMap.yaml, but I can try what you have suggested. Secondly, I am getting
I want to delete these logs from my application container, which is marketing-mock; within my container there are all these unwanted logs. Any suggestion on how to remove them?
@rockb1017 : I was able to resolve this by using the grep plugin and sending only application access logs to Splunk. But I want to convert the log messages from plain text to JSON. I tried the below with the json formatter plugin:

```
<match tail.containers.**>
  @type file
  path /var/log/containers/marketing-mock*.log
  <format>
    @type json
  </format>
</match>
```

But this doesn't work. Is there any way to convert the application logs to JSON in fluentd? I used the json parser plugin too.
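For reference, a grep filter that keeps only access-log lines might look like this sketch (the tag pattern, the `log` field name, and the match pattern are assumptions for illustration):

```
<filter tail.containers.**>
  @type grep
  <regexp>
    # keep only records whose "log" field contains the access-log marker
    key log
    pattern /API Access Log/
  </regexp>
</filter>
```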
based on your screenshot, it seems it is already json?
The event is just a string: `[INFO] [API Access Log]: [timestamp] "GET http/1.1" "/api.........`
what plugin are you trying to use in this block?
```
<match tail.containers.**>
  @type file
  path /var/log/containers/marketing-mock.log
  <format>
    @type json
  </format>
</match>
```
Yeah, I'm not sure I'm following what you are configuring, and I worry we may be struggling with something we already solved for...
Are you a current customer? Maybe a zoom with your account team will help?
You can tell your SE to reach out to me and we can get this sorted, or join the slack chat splk.it/slack
@rockb1017 : Actually, I am doing this in output.conf, but I think you are correct: I don't have any key attached to the initial [INFO] API access log, and hence I cannot convert it into JSON the way I want. The parser might need a key-value pair.

My use case is that my company has various LOBs, and each LOB logs in a different format. So I need a standardized way to convert the logs into JSON in fluentd and derive some meaningful insights from them. The order of entities in a log might differ, or the key in one log might be applicationID while another uses appid. Some LOBs are sending JSON, which I can easily consume in Splunk. What is the best solution to implement this? I think I can enhance the logs by tagging or parsing them in such a way that I add key-value pairs, making them easily consumable in Splunk.
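One common pattern for this kind of key normalization (a sketch; the tag pattern and the key names `applicationID`/`appid`/`app_id` are illustrative assumptions, not taken from the actual services) is a record_transformer filter that folds the per-LOB variants into one canonical key before the records reach the Splunk output:

```
<filter tail.containers.**>
  @type record_transformer
  enable_ruby
  <record>
    # whichever variant a LOB emits, expose it under one canonical name
    app_id ${record["applicationID"] || record["appid"] || record["app_id"]}
  </record>
  # drop the original variant keys so only the canonical one remains
  remove_keys applicationID, appid
</filter>
```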
@matthewmodestino please drop me an invite at my email prabuddha.raj@sap.com and we can proceed.
Hi @rockb1017, is there a way to just fetch some key-value pairs from the log and pass them as JSON via the record_transformer plugin?

```
<filter tail.containers.**>
  @type record_transformer
  enable_ruby
```
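A minimal sketch of what such a filter might look like, completed with a `<record>` section (the `log` field name, the new key, and the inline regex are assumptions for illustration, not your actual config):

```
<filter tail.containers.**>
  @type record_transformer
  enable_ruby
  <record>
    # pull the HTTP method out of the raw "log" string with inline Ruby;
    # field name and pattern are illustrative assumptions
    http_method ${record["log"].to_s[/"(GET|POST|PUT|DELETE)/, 1]}
  </record>
</filter>
```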
You can use regex parsing to extract what you need. Please refer to this fluentd document: https://docs.fluentd.org/filter/parser. Thank you, and I will close this!
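For the truncated sample line shown earlier, such a parser filter might look like the sketch below. The expression and field names are assumptions reconstructed from the partial example, so adjust them to the real log format:

```
<filter tail.containers.**>
  @type parser
  # parse the raw string stored in the "log" field (name is an assumption)
  key_name log
  <parse>
    @type regexp
    # named captures become JSON keys on the event
    expression /^\[(?<level>\w+)\] \[(?<category>[^\]]+)\]: \[(?<time>[^\]]+)\] "(?<method>\w+) (?<protocol>\S+)"/
  </parse>
</filter>
```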
What happened: I wanted to exclude the logs of my sidecar pod running on my Kubernetes node, but even after excluding it from the path, I still see its logs in Splunk. Secondly, I want to receive only application logs from the c4c-mock pod, which is my main application. Below is the relevant part of my values.yaml file:

```yaml
fluentd:
  path: /var/log/containers/analytics.log, /var/log/containers/c4c-mock.log, /var/log/containers/commerce-mock.log, /var/log/containers/marketing-mock.log
  exclude_path:
```
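For comparison, an `exclude_path` entry in values.yaml typically takes a list of glob patterns (a sketch; the sidecar log path below is an illustrative assumption, not your actual path):

```yaml
fluentd:
  exclude_path:
    # assumed example pattern; replace with the sidecar's actual log path
    - /var/log/containers/*sidecar*.log
```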
What you expected to happen: below is my deployed configMap, and it has the exclude path, but my Splunk still has logs from that pod.
How to reproduce it (as minimally and precisely as possible):
Anything else we need to know?: below is the log section of the values.yaml file that I am using.
Environment: