Closed by dikshachauhan-qasource 3 years ago
Pinging @elastic/integrations (Team:Integrations)
Thanks for putting this in, Diksha. I'll summarize how I see this:
If it isn't clear what to fill in here, I think we can update the docs in the in-line text in the integration, but it seems pretty good to me. If this data is intended only to be consumed from the Splunk API, we could reflect that in the larger text, e.g. 'Collect Logs from Splunk REST API (experimental)', instead of citing it vaguely as '3rd party'. @fearful-symmetry what do you think? It is not a blocker concern, mind you.
The question of what is available at any given time for the test team to validate is a separate concern we can follow up on. Let me start an email thread with more of the Engineering Productivity team.
For now, and in the short term, it may be faster for the dev team to take this and confirm the data has been seen working on Windows and/or other hosts. Alex, I don't know if that is for you or maybe @leehinman to answer?
So, that was added by @marc-gr as part of a Splunk input. Alas, I don't have any Splunk experience, so I can't say much more.
@EricDavisX I will prepare a local environment to validate it works and add the screenshots here
I checked again, setting up a new env, and it is working correctly with the experimental input.
Hope that helps!
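For anyone wanting to run a similar sanity check before wiring up the integration, a direct query against the Splunk REST API can confirm that the endpoint and credentials are reachable independently of Fleet. The sketch below is only illustrative: the host, port, credentials, and search string are placeholders, not the actual Endgame server details.

```python
# Minimal sketch for sanity-checking Splunk REST API access before testing the
# integration. Host, port, credentials, and the search string are placeholders;
# adjust them for your environment. Requires the `requests` package.
import requests

SPLUNK_URL = "https://splunk.example.com:8089"  # scheme://IP:Port; 8089 is Splunk's default management port
AUTH = ("admin", "changeme")                    # placeholder credentials

# Export a handful of recent internal events as JSON to confirm the endpoint
# and credentials work. verify=False only because test servers often use
# self-signed certificates; don't do this against production.
resp = requests.post(
    f"{SPLUNK_URL}/services/search/jobs/export",
    auth=AUTH,
    data={"search": "search index=_internal | head 5", "output_mode": "json"},
    verify=False,
    timeout=30,
)
resp.raise_for_status()
print(resp.text[:2000])  # one JSON object per line, one per event
```

If this returns events, the scheme/IP/port and credentials entered in the integration UI should be good to go.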
Thanks Marc, it does help!
@dikshachauhan-qasource I recommend we leave this testing for the Engineering team to manage for now. But before we do that, I want to confirm with the team whether this portion of the integration is expected to change much. Also, a quick note on how much of this feature set is covered by your team's automation systems would help.
If nothing else comes up, we will expect you can manage any subsequent test needs, @marc-gr. And of course, I am confirming this detail cross-team because it is part of the System integration and therefore the Default policy. But since this input isn't turned on by itself and requires more info to connect, I think it is ok if we explicitly don't cover it on the manual side, unless a detail comes up that makes it warranted or desired. Let us know and keep in touch.
@EricDavisX
Thanks for the confirmation for now and for keeping this out of the scope of our testing.
@marc-gr However, if we ever need to test this in the future, we will need the related docs/steps so that we can validate it accordingly.
Thanks QAS
I spoke briefly with @dikshachauhan-qasource this morning; if we were more familiar with Splunk, this would be clearer. In re-reviewing the image she attached, I think we have all the info we need to test, if we ever wish to. Diksha, the Endgame Splunk server should be available if we want to try it out; I can fwd you the creds if you don't have them.
The UI in the screenshot says `scheme://IP:Port` and that the endpoint is inferred automatically. So it seems clear enough to me now.
With this I'm closing this out, and we can chat offline.
Hi @EricDavisX
We have re-attempted this on the released 7.13 Kibana cloud and self-managed 7.13 Kibana builds, using the Endgame Splunk credentials to validate it. However, we were unable to get the desired results.
Further, as discussed on Slack, we will continue with the System third-party REST API tests once the feature is available in a more stable state.
Thanks QAS
While testing System package 12.3 on the 7.13 Snapshot build, we observed that a new logs collection sub-tab related to the REST API has been added to the System integration. A screenshot is provided here for reference:
Query: It would be very helpful if anybody could let us know how to validate these events. Kindly share documentation links or PRs if available.