IBM / IBM-QRadar-Universal-Cloud-REST-API

These workflows are provided for sample usage, new submissions and updates from the community, and are NOT supported by IBM.

Not a JSON Array, timestamps in the payloads weeks old #57

Closed matt-pb closed 3 years ago

matt-pb commented 3 years ago

I have constructed an XML workflow for NS1 DNS. I am getting what appear to be real-time logs in JSON format. However, the timestamps in the payloads (in epoch time) do not come close to the Start Time of the event as shown in QRadar. We have confirmed with the end users that the time in the payload is correct, not the Start Time in QRadar.

I am following this for command line troubleshooting:

https://www.ibm.com/docs/en/dsm?topic=protocol-command-line-testing-tool

As we are using version 1, I have properly configured my command and am getting this output:

[ workflow]# java -cp "/opt/ibm/si/services/ecs-ec-ingress/current/bin/:/opt/ibm/si/services/ecs-ec-ingress/eventgnosis/lib/q1labs/" com.q1labs.semsources.sources.universalcloudrestapi.UniversalCloudRESTAPITest -wp Default-Workflow-Parameter-Values.xml -w qroc1.xml

SLF4J: Class path contains multiple SLF4J bindings.

SLF4J: Found binding in [jar:file:/opt/ibm/si/services/ecs-ec-ingress/2020.7.3.20210323172312/bin/slf4j-log4j12-1.7.13.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: Found binding in [jar:file:/opt/ibm/si/services/ecs-ec-ingress/2020.7.3.20210323172312/bin/slf4j-simple-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]

SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

2021-08-13 16:22:39 [INFO ][UniversalCloudRESTAPITest] Status changed to ERROR: Not a JSON Array: "<!doctype html>\n \n \n <meta http-equiv=\"content-type\" content=\"text/html; charset=UTF8\">\n NS1 | Customer Portal\n <meta name=\"google\" content=\"notranslate\" />\n <script src=\"/cdn-cgi/apps/head/BGkBeDlUJpHx3swQRyf58HOAico.js\"><link rel=\"shortcut icon\" type=\"image/ico\" href=\"assets/favicon.ico\">\n \n <link href=\"static/iconfont.css?v=<%= timestamp %>\" rel=\"stylesheet\" />\n \n \n\n <div id=\"app-body\">\n <div id=\"error-modal\">\n\n <script id=\"ga-script-holder\">\n\n \n <script src=\"ns1.js?v=1628794125638\">\n \n \n "

Again, I pass the tests and get real-time logs in JSON format. I just can't build use cases off them, as the timestamps are not correct.

I can provide the workflow.xml if necessary.

Thank you,

-Matt

Fa6s commented 3 years ago

Hey Matt, seeing the XML code would be helpful. If a timestamp in the payload is weeks old, then that is a problem at the source you pull the events from. The other thing I could imagine is that the date format is not set correctly for these logs in the QRadar DSM Editor. The QRadar Start Time has no relationship whatsoever with the payload content; Start Time is the time an event gets processed in QRadar. The field that should match the time in the payload is the Log Source Time.

ChrisCollinsIBM commented 3 years ago

Collecting events and parsing events are two completely separate processes. If the events are getting into QRadar as expected, then I don't think it's an issue with the workflow, but rather with the timestamp parsing in the Custom DSM.

If you could post a sanitized event sample, even just the section relevant to timestamp parsing, along with the expression you're using in the DSM Editor, we might be able to help out.
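For illustration, a sanitized sample would only need the timestamp portion. A hypothetical NS1 payload (this field name is made up, not confirmed from the actual logs) might look like:

    { "timestamp": 1628870559, "action": "query", "zone": "example.com" }

In the DSM Editor, the Log Source Time property would then use a JSON expression along the lines of /"timestamp". One common cause of wildly wrong dates with epoch values is a seconds-vs-milliseconds mismatch, so it's worth checking which unit the API actually returns.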

matt-pb commented 3 years ago

Hello!

Thanks for the reply!

We received from NS1 the output of all events in the last week as they see them. We had fewer than 10 events, yet this log source shows an EPS of over 1 in QRadar.

It appears that the workflow continually loops, with or without a DoWhile statement. I see the same payloads with the same metadata looping over and over again as raw logs. This does not appear to be a parsing issue.

I have also used the command-line testing tool and watched the same activity before it is processed and parsed.

NS1 offers a way to hard-code the start and end of the request period in the GET statement. I figured out the XML syntax; however, both the start and end times would have to be dynamic variables coded in the workflow for it to pull properly. Perhaps I'm just not seeing it, but I don't see any other workflows on this site that use that logic.

Here is the workflow (note: I have tried it with and without the DoWhile and If statements; judging from the command-line output, it appears to loop either way):

<?xml version="1.0" encoding="UTF-8" ?>

Thank you,

-Matt

Fa6s commented 3 years ago

Hey Matt, I've just written such a workflow with dynamic start and end times; it is currently in a pull request.

The magic happens here: first, if the workflow has never run before, I set a start time of today minus 24 hours. Format the date to whatever your API needs; I needed "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'":

<!-- Set a startDateTime (today minus 24 hours, i.e. 86400000 ms) if the workflow runs for the first time on a system; if a startDateTime from a previous run already exists, skip and continue -->
        <If condition="empty(/startDateTime)" >
            <!-- Initialize the first startDateTime, if not yet initialized, with the current timestamp minus 24 hours (86400000 ms) -->
            <Initialize path="/startDateTime" value="${time() - 86400000}" />
            <!-- Format the date to work with the API -->
            <FormatDate pattern="yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" timeZone="UTC" time="${/startDateTime}" savePath="/startDateTime" />
        </If>

After that, you set the end time to the current time:

<!-- Format date to work with the Trend Micro Vision One API's date format -->
        <FormatDate pattern="yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" timeZone="UTC" time="${time()}" savePath="/endDateTime" />

Then I make the first GET request with the start and end datetimes:

            <CallEndpoint url="https://${/hostURL}/v2.0/siem/events" method="GET" savePath="/get_events_alertList/response">
                <BearerAuthentication token="${/bearer_token}"/>
                <QueryParameter name="startDateTime" value="${/startDateTime}" />
                <QueryParameter name="endDateTime" value="${/endDateTime}" />
                <RequestHeader name="Content-Type" value="application/json; charset=utf-8" />
            </CallEndpoint>

Then, after the workflow has run successfully, I set the startDateTime to the current endDateTime, so that whenever the next run happens, the start is fixed at the end of the last successful run. This works because the value persists in the workflow "state" until changed: you could suffer a three-day QRadar outage and, as long as the state remains untouched, the next pull would still start from the time the workflow last ran before the outage:

<!-- Set new startDateTime as last run EndDateTime after successful run to save the last run end for the next run as new startDateTime  -->
        <Set path="/startDateTime" value="${/endDateTime}" />
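Putting the snippets above together, a minimal skeleton of this pattern might look like the following. This is a sketch, not a drop-in workflow: the hostURL and bearer_token parameters, the endpoint path, and the PostEvents JSON path all come from my Trend Micro example and would need to be adapted to the NS1 API's actual response shape.

    <Workflow name="DynamicWindowExample" version="1.0" xmlns="https://qradar.ibm.com/UniversalCloudRESTAPI/Workflow/V1">
        <Parameters>
            <Parameter name="hostURL" label="Host URL" required="true" />
            <Parameter name="bearer_token" label="Bearer Token" required="true" secret="true" />
        </Parameters>
        <Actions>
            <!-- First run only: initialize the window start to now minus 24 hours (86400000 ms) -->
            <If condition="empty(/startDateTime)" >
                <Initialize path="/startDateTime" value="${time() - 86400000}" />
                <FormatDate pattern="yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" timeZone="UTC" time="${/startDateTime}" savePath="/startDateTime" />
            </If>
            <!-- Every run: the window ends at the current time -->
            <FormatDate pattern="yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" timeZone="UTC" time="${time()}" savePath="/endDateTime" />
            <!-- Pull only the events inside the window -->
            <CallEndpoint url="https://${/hostURL}/v2.0/siem/events" method="GET" savePath="/get_events/response">
                <BearerAuthentication token="${/bearer_token}" />
                <QueryParameter name="startDateTime" value="${/startDateTime}" />
                <QueryParameter name="endDateTime" value="${/endDateTime}" />
            </CallEndpoint>
            <!-- Post the returned events; the JSON path depends on the API's response structure -->
            <PostEvents path="/get_events/response/body" source="${/hostURL}" />
            <!-- Persist the window: the next run starts where this one ended -->
            <Set path="/startDateTime" value="${/endDateTime}" />
        </Actions>
        <Tests>
            <DNSResolutionTest host="${/hostURL}" />
        </Tests>
    </Workflow>

Because /startDateTime and /endDateTime live in the workflow state, each run pulls exactly one contiguous, non-overlapping window, which should also stop the repeated payloads you are seeing.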

Let me know if anything is unclear and I can explain it in more detail.

ChrisCollinsIBM commented 3 years ago

Seems like this question has been addressed and some feedback given, please re-open this issue if further assistance is required.