Closed ncrothe closed 6 years ago
Looking at the code, I am aware I can work around this by making sure to set the json event fields ddsource, ddsourcecategory and ddtags in e.g. a record-transformer filter, but would appreciate a more direct solution.
Hello @ncrothe,
Thanks for reaching out. Using a record transformer to add the parameters directly to your JSON events would indeed have been my first suggestion, for example:
```
<filter **>
  @type record_transformer
  <record>
    ddsource "${tag_parts[0]}"
    ...
  </record>
</filter>
```
I understand that it would be a nice feature for our plugin to handle this behaviour as well. That said, since it is already handled natively by Fluentd's record_transformer plugin, would it be acceptable for you to use that extra plugin until we improve our plugin to support this feature?
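For reference, a more complete version of that workaround might look like the following (a sketch using Fluentd's built-in record_transformer filter; the tag layout and the literal field values are illustrative, not a recommendation):

```
<filter **>
  @type record_transformer
  <record>
    # Illustrative values: adjust to your own tag layout and tagging scheme.
    ddsource "${tag_parts[0]}"
    ddsourcecategory "custom"
    ddtags "env:prod,team:backend"
  </record>
</filter>
```

The Datadog output then picks these fields up from each record, so no per-source configuration is needed on the output itself.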
Thanks
Yes, definitely. That's what we're doing now, and it works as expected.
Treat it as a feature request then.
As this is a feature request, would it be possible to send it to support@datadoghq.com (you can reference this issue in your email) and close this issue?
This would ensure that you are notified as soon as there is an update on it.
Done
If anyone else stumbles on this page while trying to figure out how to do this with Fluent Bit, this worked for me: first, remove `dd_source` from the `[OUTPUT]` section. Then add filters like:
```
[FILTER]
    Name   record_modifier
    Match  api-firelens*
    Record ddsource python

[FILTER]
    Name   record_modifier
    Match  nginx-firelens*
    Record ddsource nginx
```
Note the record name needs to be `ddsource` (no `_`). If you're using AWS + ECS + FireLens, the `Match` pattern is `containername-firelens` (documented here and here).
You'll now see the expected sources in Datadog.
Hope this helps someone 👨🏻💻
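Putting the pieces above together, the Fluent Bit output section might look like this (a sketch; the match pattern, service name, and API key placeholder are assumptions for illustration):

```
[OUTPUT]
    Name       datadog
    Match      *-firelens*
    apikey     <YOUR_DD_API_KEY>
    dd_service my-service
    # dd_source is intentionally omitted so the ddsource record field
    # set by the per-container filters above takes effect instead.
```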
Describe what happened: I tried to forward log entries from a central processor and set source/service on the dd event based on dynamic fields or even tag elements. E.g. using the following config:
I understand that `tag_parts` is specific to record_transformer, but I also can't reference any record fields, because the code takes the parameter values verbatim and puts them into new record fields. In the Datadog Log Explorer I end up with the literal parameter value as the source (or similar if I reference actual fields).
Describe what you expected: Some way of using dynamic values when specifying dd_source, dd_tags, and dd_sourcecategory. Maybe extending the `tag_key` mechanism would be powerful enough, i.e. have either `dd_source` with a static value or `dd_source_field` with a field reference. Ideally I'd also be able to use tag parts, but I can work around that.
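The proposed precedence could be sketched as follows (hypothetical: `resolve_source` and the `dd_source_field` option are not part of the plugin; this only illustrates the static-value-versus-field-reference behaviour being requested):

```ruby
# Hypothetical helper illustrating the proposed dd_source / dd_source_field
# precedence: a field reference wins when configured and present in the
# record; otherwise the static value (if any) is used.
def resolve_source(record, dd_source: nil, dd_source_field: nil)
  if dd_source_field && record.key?(dd_source_field)
    record[dd_source_field]
  else
    dd_source
  end
end

# Example: pick the source dynamically from the record's "app" field.
resolve_source({ "app" => "nginx" }, dd_source_field: "app")
```

The same pattern would extend naturally to `dd_tags_field` and `dd_sourcecategory_field`.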
Steps to reproduce the issue: Create any source in fluentd and configure the datadog plugin to forward using field references.
Additional environment details (Operating System, Cloud provider, etc): Ubuntu running on AWS. The datadog plugin runs on a central processor which gets events from source instances which capture the actual log events.