Open · yarinm opened this issue 6 years ago
Hi @yarinm - I agree. For most of our connectors that send data to Splunk via HEC, we add support for both endpoints.
This feature would be greatly appreciated! The JSON endpoint for HEC doesn't do timestamp extraction, so we're unable to post-process the timestamp via props.conf. As a result our events (coming in through ingested CSV files) all reflect ingest time instead of the actual occurrence time (contained in a field within the CSV lines). Splunk allows timestamp extraction only on the raw endpoint.
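A minimal sketch of why the two endpoints behave differently, assuming a placeholder HEC host, token, channel GUID, and sourcetype (none of these come from this plugin): the /event endpoint takes a JSON envelope and does not run props.conf timestamp extraction on the `event` payload, while the /raw endpoint indexes the body as raw data, so the sourcetype's timestamp rules apply on the Splunk side.

```python
# Sketch only: contrasts the two HEC endpoints. Host, token, channel, and
# sourcetype are placeholders; verify=False is only for the sketch.
import json
import requests

HEC = "https://splunk.example.com:8088/services/collector"
HEADERS = {"Authorization": "Splunk 00000000-0000-0000-0000-000000000000"}

csv_line = "2019-03-01 12:34:56,host01,login,success"

# /event endpoint: the timestamp must be supplied explicitly via the "time"
# field of the JSON envelope; props.conf timestamp extraction is not applied.
requests.post(
    f"{HEC}/event",
    headers=HEADERS,
    data=json.dumps({"event": csv_line, "sourcetype": "my_csv"}),
    verify=False,
)

# /raw endpoint: the body is indexed as raw data, so the sourcetype's
# props.conf settings (TIME_PREFIX, TIME_FORMAT, LINE_BREAKER, ...) run on
# the Splunk side and the timestamp inside the CSV line is extracted there.
# Depending on your HEC configuration, a channel GUID may be required.
requests.post(
    f"{HEC}/raw",
    params={"sourcetype": "my_csv", "channel": "FE0ECFAD-13D5-401B-847D-77833BD77131"},
    headers=HEADERS,
    data=csv_line,
    verify=False,
)
```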
@SVPA-LenPistoria you should take a look at my PR https://github.com/splunk/fluent-plugin-splunk-hec/pull/18
It adds the ability to define a field in the record that will be used as the event time when sending the event to HEC.
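For illustration, a sketch of what "use a record field as the event time" means at the HEC level: the chosen field is converted to epoch seconds and placed in the `time` field of the /event payload. The record key `timestamp`, its format, and the sourcetype are hypothetical; the actual option name added by the PR may differ.

```python
# Sketch only: turn a hypothetical record field into the HEC "time" field so
# Splunk indexes the event at its occurrence time, not at ingest time.
import json
from datetime import datetime, timezone

record = {"timestamp": "2019-03-01 12:34:56", "user": "host01", "action": "login"}

# Convert the chosen record field (assumed UTC here) to epoch seconds.
epoch = (
    datetime.strptime(record["timestamp"], "%Y-%m-%d %H:%M:%S")
    .replace(tzinfo=timezone.utc)
    .timestamp()
)

# The payload is then POSTed to /services/collector/event as in the sketch above.
payload = json.dumps({"time": epoch, "event": record, "sourcetype": "my_csv"})
```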
@yarinm This would be an even better solution to my woes! Let's hope your PR gets merged soon! Thanks!
+1 on sending via the raw endpoint. I'm keen to use Splunk to do the timestamp extraction rather than trying to do it within fluent.
From my point of view, line breaking/merging should be done on the Splunk side. I created a prototype, and there is also this branch; both add an option for using the RAW endpoint instead of the EVENT endpoint. I tested it and it worked for me. Many Splunk customers are asking for this. Is anyone interested in working on a PR?
Is there any news on this issue? We are looking forward to using the raw endpoint. Right now we are not able to get a Java stack trace into Splunk as a single event.
I'm also really interested in adding the option to send via the raw endpoint. Is there any news?
Is there any plan to implement this feature?
Any update on this?
It could be useful to add support for the /raw API of HEC. When using the /event endpoint, some rules (like LINEMERGE/BREAK) are not applied; they only work when the logs are batched and sent to the raw endpoint.
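A sketch of the batching use case described in this thread, assuming a placeholder host, token, channel GUID, and a hypothetical "java_app" sourcetype: a multi-line chunk (e.g. a Java stack trace) is posted as one raw body to /raw, so the sourcetype's line-breaking rules in props.conf (SHOULD_LINEMERGE / LINE_BREAKER) decide the event boundaries on the Splunk side instead of each line arriving as a separate pre-formed event.

```python
# Sketch only: send a multi-line Java stack trace to the /raw endpoint so that
# Splunk's own line-breaking rules keep it as a single event. Host, token,
# channel, and sourcetype are placeholders; verify=False is only for the sketch.
import requests

stack_trace = (
    "2019-03-01 12:34:56 ERROR Unhandled exception\n"
    "java.lang.NullPointerException: boom\n"
    "\tat com.example.Service.handle(Service.java:42)\n"
    "\tat com.example.Main.main(Main.java:10)\n"
)

requests.post(
    "https://splunk.example.com:8088/services/collector/raw",
    params={"sourcetype": "java_app", "channel": "FE0ECFAD-13D5-401B-847D-77833BD77131"},
    headers={"Authorization": "Splunk 00000000-0000-0000-0000-000000000000"},
    data=stack_trace,
    verify=False,
)
```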