swiftbird07 / IRIS-SOAR

🚀 IRIS-SOAR: Modular SOAR (Security Orchestration, Automation, and Response) implementation in Python. Designed to complement DFIR-IRIS through playbook automation and seamless integrations. Easily extensible and in active development. Join us in building a tool geared towards enhancing security efficiency!
MIT License

Graylog Integration #1

Open shaker402 opened 7 months ago

shaker402 commented 7 months ago

I am trying to integrate IRIS-SOAR with wazuh-indexer 4.4, but I get the following errors:

root@IRIS:/IRIS-SOAR# python3 iris-soar.py --restart
2024-03-12 21:49:59,942 - isoar - INFO - Restarting IRIS-SOAR...
2024-03-12 21:49:59,942 - isoar - INFO - Stopping IRIS-SOAR...
2024-03-12 21:49:59,964 - isoar - INFO - Daemon not running
2024-03-12 21:49:59,987 - isoar - INFO - Worker script not running
2024-03-12 21:49:59,988 - isoar - WARNING - Nothing to stop!
2024-03-12 21:49:59,998 - isoar - INFO - Daemon disabled. Starting the main loop (isoar_worker.py) directly...
2024-03-12 21:50:00,053 - isoar_collector - INFO - Started IRIS-SOAR collector script
2024-03-12 21:50:00,053 - isoar_collector - INFO - Checking for new alerts...
2024-03-12 21:50:00,151 - isoar_collector - INFO - Calling module elastic_siem
/usr/local/lib/python3.9/dist-packages/elasticsearch/_sync/client/__init__.py:399: SecurityWarning: Connecting to 'https://192.168.59.128:9200' using TLS with verify_certs=False is insecure
  _transport = transport_class(
2024-03-12 21:50:00,251 - isoar_collector - WARNING - The module elastic_siem had an unhandled error when trying to provide new alerts. Error:
Traceback (most recent call last):
  File "/root/IRIS-SOAR/isoar_alert_collector.py", line 130, in main
    new_alerts = module_import.irsoar_provide_new_alerts(integration_config)
  File "/root/IRIS-SOAR/integrations/elastic_siem.py", line 1256, in irsoar_provide_new_alerts
    result = elastic_client.search(
  File "/usr/local/lib/python3.9/dist-packages/elasticsearch/_sync/client/utils.py", line 446, in wrapped
    return api(*args, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/elasticsearch/_sync/client/__init__.py", line 3836, in search
    return self.perform_request(  # type: ignore[return-value]
  File "/usr/local/lib/python3.9/dist-packages/elasticsearch/_sync/client/_base.py", line 320, in perform_request
    raise HTTP_EXCEPTIONS.get(meta.status, ApiError)(
elasticsearch.ApiError: ApiError(406, 'Content-Type header [application/vnd.elasticsearch+json; compatible-with=8] is not supported', 'Content-Type header [application/vnd.elasticsearch+json; compatible-with=8] is not supported')
. Skipping Integration.
2024-03-12 21:50:00,251 - isoar_collector - WARNING - The module ibm_qradar is disabled. Skipping.
2024-03-12 21:50:00,251 - isoar_collector - WARNING - The module matrix_notify is disabled. Skipping.
2024-03-12 21:50:00,252 - isoar_collector - INFO - Finished collector script.
root@IRIS:~/IRIS-SOAR#

Is the error because of compatibility with wazuh-indexer 4.x? I kindly need your support, please.

swiftbird07 commented 7 months ago

Hey, as I see you are using the Elasticsearch integration. This integration is meant to be used with Elastic SIEM / Elastic Security to fetch alerts from there (and get EDR context etc.). The Wazuh indexer also uses Elasticsearch under the hood, but it is not the same as Elastic SIEM, as it e.g. handles alerts differently.

I am using Wazuh myself, but in combination with Elastic SIEM, where I ship all Wazuh alerts and events directly to Elastic SIEM via Filebeat. I can help you with that if you want. Otherwise you might want to either edit the Elasticsearch integration to fit the Wazuh indexer API or, better, make a new integration wazuh.py and fetch alerts directly via the well-documented Wazuh API. I could help you with the latter if you want.
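To give you an idea, a skeleton for such an integration could look roughly like the sketch below. The entry-point name irsoar_provide_new_alerts is the one the collector already calls for other integrations (see your traceback); the endpoint, index pattern, and config keys here are only placeholder assumptions that would need to be adapted:

```python
import requests

def irsoar_provide_new_alerts(config):
    """Sketch of a hypothetical integrations/wazuh.py, not actual project code.

    The function name matches what the collector calls for other integrations;
    the config keys, URL, and index pattern below are placeholder assumptions.
    """
    url = config.get("indexer_url", "https://192.168.59.128:9200")
    auth = (config.get("user", "admin"), config.get("password", "changeme"))
    verify = config.get("verify", False)

    # Replace match_all with a real alert filter (e.g. on rule level) later.
    body = {"size": 100, "query": {"match_all": {}}}
    resp = requests.get(f"{url}/wazuh-alerts-*/_search", json=body, auth=auth, verify=verify)
    resp.raise_for_status()

    hits = resp.json()["hits"]["hits"]
    # Map each hit to the alert object IRIS-SOAR expects (omitted here).
    return [hit["_source"] for hit in hits]
```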

Greetings Martin

shaker402 commented 7 months ago

Thank you, dear @swiftbird07, for your wonderful solutions. In truth, I used the "World's Best SOC Built on Open Source Tools" series (https://socfortress.medium.com/build-your-own-siem-stack-with-open-source-tools-series-39da0f2d412a).

The wazuh-indexer is the backend log storage, and Graylog is the log ingestion layer, which receives logs from wazuh-manager and stores them in the wazuh-indexer indices, but in a different format.

For that reason, I think the best solution is to make a new integration wazuh.py and fetch alerts directly via the well-documented Wazuh API.

However, I need to fetch events from the wazuh-indexer indices, not alerts from the Wazuh dashboard or wazuh-manager, since Graylog changes the format of the events and makes them incompatible with visualization in the Wazuh dashboard; all events are stored in the wazuh-indexer indices.

Dear brother, I need your help very much to finish my project and deliver it to my top management. I appreciate your precious time and hope you can help me; I cannot build a new integration wazuh.py myself.

swiftbird07 commented 7 months ago

Hm, to begin, we both need the same software installed. Do you have a docker compose file or something similar for the solution you are using?

Also, is Graylog sending alerts as well? The main goal of the SOAR is to fetch and then enrich alerts from external sources. I am a bit confused why you are not just using Wazuh as it is.

You can use a simple docker compose file to use Wazuh with all its features.

If you want to try Elastic SIEM (where this SOAR is already fully supported), it is a bit more complicated to set up normally, but in the end you have a full-fledged SIEM solution plus EDR capabilities (which Wazuh lacks IMO). I can point you to my docker compose file that makes using Elastic SIEM much easier: just use docker compose and you are ready to go (it should be public under my name).

swiftbird07 commented 7 months ago

Here is a link to the Elastic docker compose I mentioned: https://github.com/swiftbird07/pfelk-docker. You can ignore all the pfelk stuff (it is only needed if you want to natively integrate Elastic SIEM with pfSense/OPNsense); just edit the .env and then run docker compose up -d.

shaker402 commented 7 months ago

Thanks so much for your Elastic docker compose; it will help me a lot.

To answer your question, "why are you not just using Wazuh as it is?": because my project is SOC as a service, and Graylog has many flexible features.

Why Graylog? With multiple log forwarders currently available, such as filebeat, logstash, NIFI, fluent-bit, etc., why should I pick Graylog?

1- Currently Graylog supports up to OpenSearch 1.3 and Elasticsearch.

2- Multiple Inputs Supported: Perhaps we want to ingest AWS logs, firewall logs, and endpoint logs.

3- Index Management: Indexes are the building blocks of how Elastic/OpenSearch store data.

4- Log Normalization: Normalizing our data is a must. We need to ensure common data fields that we receive from our logs (no matter the source) are universally mapped so we can create dashboard and alerting standards that apply to all log types. Your future self will love you!


5- No Data Loss: Unfortunately, failures of our backend storage can happen. Thankfully Graylog can detect when the backend is in an unhealthy state and write logs to disk until the backend is back online.

Finally, you can control and isolate a multi-tenant SOC easily.

Check this series: https://medium.com/@socfortress/build-your-own-siem-stack-with-open-source-tools-series-39da0f2d412a

I just need to know: does IRIS-SOAR fetch alerts from an Elasticsearch index?

Look:

How is it going in general for the Wazuh and Graylog setup? For example: Wazuh dashboards are created based on field names with a dot separator, and Graylog does not currently support this. Graylog replaces the dot via the JSON extractor before it forwards the message into the OpenSearch index.

For example: wazuh-manager sends the event field "windows.sysmon:event1" and Graylog changes it to "windows_sysmon:event1".

But look at Elasticsearch in the image below; it supports the underscore:
image

I have no problem replacing wazuh-indexer with Elasticsearch. My project will be like your Elastic docker compose, but introducing Graylog as log ingestion and wazuh-manager as log analyzer, exactly like this setup https://www.youtube.com/watch?v=ZO2KSLmB6vY, just with wazuh-indexer replaced by Elasticsearch.

May I ask if you can add wazuh-manager and Graylog to your Elastic docker compose? The integration of wazuh-manager, Graylog, and Elasticsearch is shown here: https://www.youtube.com/watch?v=ZO2KSLmB6vY. I do not have enough experience with Docker to do it myself.

shaker402 commented 7 months ago

Dear brother @swiftbird07, in addition to the Graylog features above, we can also use Graylog as a backend app to publish a new front-end SOC solution, because it supports log normalization to customize the backend format and can send logs to any front-end app.

shaker402 commented 7 months ago

Dear bro, kindly correct me if I'm on the wrong path, please.

shaker402 commented 7 months ago

Dear, Wazuh-Indexer up to 4.2 is based on Elasticsearch, but Wazuh-Indexer from 4.3 onward is based on OpenSearch.

Can I make a new integration wazuh.py and fetch alerts directly from Wazuh-Indexer 4.3 and later? Is it possible?

Thank you very much for your time.

swiftbird07 commented 7 months ago

Should be possible, but I am not sure. I would recommend using the Wazuh API directly if possible. What exactly do you want the SOAR to do? Fetch alerts and send them to IRIS?

shaker402 commented 7 months ago

I have imported this module into my IRIS-web, and it works successfully. Please check this module: https://github.com/socfortress/iris-wazuhindexer-module

However, this module only accesses the wazuh-indexer indices to search for IOCs.

As for what I need the SOAR to do: yes, exactly that.

shaker402 commented 7 months ago

Can I have a session with you, please?

At any time that is appropriate for you.

this is my email : shakeralkmali@gmail.com

swiftbird07 commented 7 months ago

IRIS-SOAR is able to fetch alerts from different sources and add them as alerts in DFIR-IRIS. Also it can enrich these (or other existing) alerts in IRIS with context or threat intel data. Do you want to do that? If you have a different use case you may have to look somewhere else.

shaker402 commented 7 months ago

Yes, dear, that is just what I need: an integration with wazuh-indexer to fetch events from its indices.

I strongly appreciate your smart SOAR project, and if we need to add additional features, I will cooperate with you in the future if your time allows.

swiftbird07 commented 7 months ago

I am currently quite busy but I will have a look on Sunday.

shaker402 commented 7 months ago

Take your time, dear swiftbird. Thank you very much.

shaker402 commented 7 months ago

Dear Swiftbird,

I hope this message finds you well. I understand that you're quite busy, and I appreciate the time and effort you've put into the open-source community.

I'm reaching out again regarding the issue I'm facing with the integration in my project. Despite my best efforts, I've been unable to resolve it. Your expertise and guidance would be invaluable to me at this point.

I understand that you might still be occupied, but if you could spare some time to look into this, it would mean a lot to me and my project. I'm more than willing to provide any additional information that might assist you in understanding the issue better.

Thank you once again for your time and support.

swiftbird07 commented 7 months ago

I added a branch for developing a Wazuh indexer integration. As I am still not quite sure what kind of events from the Wazuh indexer you want to convert to actual alerts, this integration will fail at the point where the alert instance would be created. Though up to that point you should at least be able to query the Wazuh index (change the new config accordingly).

swiftbird07 commented 7 months ago

Please provide me with concrete sample alerts from the indexer that you want to convert to IRIS alerts and I can help you do that.

shaker402 commented 7 months ago

Index name = wazuh_alert_1_perfix*

sample alerts :

Level_Descriptions

agent_id

agent_name

data_win_eventdata_company

rule_level

rule_description

data_win_eventdata_queryName

image

swiftbird07 commented 7 months ago

OK, and can you get this data so far? If yes, we can begin to parse it.

shaker402 commented 7 months ago

:~/IRIS-SOAR-feature-wazuh-indexer-integration# python3 iris-soar.py --start
2024-03-19 23:30:36,836 - isoar - INFO - Starting IRIS-SOAR
2024-03-19 23:30:36,848 - isoar - INFO - Daemon disabled. Starting the main loop (isoar_worker.py) directly...
2024-03-19 23:30:36,909 - isoar_collector - INFO - Started IRIS-SOAR collector script
2024-03-19 23:30:36,909 - isoar_collector - INFO - Checking for new alerts...
2024-03-19 23:30:36,909 - isoar_collector - WARNING - The module elastic_siem is disabled. Skipping.
2024-03-19 23:30:37,077 - isoar_collector - INFO - Calling module wazuh_indexer
2024-03-19 23:30:37,090 - integrations.wazuh_indexer - INFO - irsoar_provide_new_alerts() called.
2024-03-19 23:30:37,091 - isoar_collector - WARNING - The module wazuh_indexer had an unhandled error when trying to provide new alerts. Error:
Traceback (most recent call last):
  File "/root/IRIS-SOAR-feature-wazuh-indexer-integration/integrations/wazuh_indexer.py", line 282, in irsoar_provide_new_alerts
    verify = config["verify"]
KeyError: 'verify'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/IRIS-SOAR-feature-wazuh-indexer-integration/isoar_alert_collector.py", line 130, in main
    new_alerts = module_import.irsoar_provide_new_alerts(integration_config)
  File "/root/IRIS-SOAR-feature-wazuh-indexer-integration/integrations/wazuh_indexer.py", line 285, in irsoar_provide_new_alerts
    mlog.critical("Missing config parameters: " + e)
TypeError: can only concatenate str (not "KeyError") to str
. Skipping Integration.
2024-03-19 23:30:37,091 - isoar_collector - WARNING - The module ibm_qradar is disabled. Skipping.
2024-03-19 23:30:37,091 - isoar_collector - WARNING - The module matrix_notify is disabled. Skipping.
2024-03-19 23:30:37,092 - isoar_collector - WARNING - The module virus_total is disabled. Skipping.
2024-03-19 23:30:37,092 - isoar_collector - INFO - Finished collector script.
root@IRIS:~/IRIS-SOAR-feature-wazuh-indexer-integration#

and the config file is as shown in screenshot : REDACTED

Edit: Removed screenshot containing sensitive information

swiftbird07 commented 7 months ago

It should be fixed now. Be sure to use the new package (see requirements.txt). If you want me to parse your alert please give me the json.
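For reference, the two errors in the trace above point to a pattern like the following (just a sketch of the defensive config read, not necessarily how it is implemented in the branch; the logger setup is added here only to keep the snippet self-contained):

```python
import logging

mlog = logging.getLogger("integrations.wazuh_indexer")

def irsoar_provide_new_alerts(config):
    """Sketch only: read required config values defensively."""
    try:
        verify = config["verify"]  # raises KeyError when "verify" is missing from the config
    except KeyError as e:
        # str(e) avoids the follow-up TypeError from concatenating a KeyError to a str
        mlog.critical("Missing config parameters: " + str(e))
        return []
    # ... continue with the search, using `verify` for TLS certificate verification ...
    return []
```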

shaker402 commented 7 months ago

Thank you very much, bro swiftbird.

The connection now works successfully; I just have a problem with parsing:

Note: I used the timestamp field; this is the only field that gets hits.

You asked me for the JSON. Which JSON file do you need?

If you mean "/var/ossec/logs/alerts/alerts.json", then this file is sent from wazuh-manager to Graylog, which parses its events and stores them in a new format in the wazuh-indexer indices, as shown in the screenshot: image

Also see my indices in wazuh-indexer: image

And below, my Grafana pulls events from wazuh-indexer: image

image

image

Kindly take a look at the parsing issue below:

root@IRIS:/IRIS-SOAR-feature-wazuh-indexer-integration# python3 iris-soar.py --start
2024-03-20 19:14:23,638 - isoar - INFO - Starting IRIS-SOAR
2024-03-20 19:14:23,649 - isoar - INFO - Daemon disabled. Starting the main loop (isoar_worker.py) directly...
2024-03-20 19:14:23,722 - isoar_collector - INFO - Started IRIS-SOAR collector script
2024-03-20 19:14:23,722 - isoar_collector - INFO - Checking for new alerts...
2024-03-20 19:14:23,722 - isoar_collector - WARNING - The module elastic_siem is disabled. Skipping.
2024-03-20 19:14:23,757 - isoar_collector - INFO - Calling module wazuh_indexer
2024-03-20 19:14:23,771 - integrations.wazuh_indexer - INFO - irsoar_provide_new_alerts() called.
/usr/local/lib/python3.9/dist-packages/elasticsearch/connection/http_urllib3.py:209: UserWarning: Connecting to https://192.168.59.128:9200 using SSL with verify_certs=False is insecure.
  warnings.warn(
2024-03-20 19:14:23,772 - integrations.wazuh_indexer - INFO - Searching Wazuh-Indexer for: raspberry-pi.home contained within the field name timestamp
2024-03-20 19:14:24,835 - isoar_collector - WARNING - The module wazuh_indexer had an unhandled error when trying to provide new alerts. Error:
Traceback (most recent call last):
  File "/root/IRIS-SOAR-feature-wazuh-indexer-integration/isoar_alert_collector.py", line 130, in main
    new_alerts = module_import.irsoar_provide_new_alerts(integration_config)
  File "/root/IRIS-SOAR-feature-wazuh-indexer-integration/integrations/wazuh_indexer.py", line 314, in irsoar_provide_new_alerts
    res = es.search(
  File "/usr/local/lib/python3.9/dist-packages/elasticsearch/client/utils.py", line 168, in _wrapped
    return func(*args, params=params, headers=headers, **kwargs)
  File "/usr/local/lib/python3.9/dist-packages/elasticsearch/client/__init__.py", line 1670, in search
    return self.transport.perform_request(
  File "/usr/local/lib/python3.9/dist-packages/elasticsearch/transport.py", line 415, in perform_request
    raise e
  File "/usr/local/lib/python3.9/dist-packages/elasticsearch/transport.py", line 381, in perform_request
    status, headers_response, data = connection.perform_request(
  File "/usr/local/lib/python3.9/dist-packages/elasticsearch/connection/http_urllib3.py", line 277, in perform_request
    self._raise_error(response.status, raw_data)
  File "/usr/local/lib/python3.9/dist-packages/elasticsearch/connection/base.py", line 330, in _raise_error
    raise HTTP_EXCEPTIONS.get(status_code, TransportError)(
elasticsearch.exceptions.RequestError: RequestError(400, 'search_phase_execution_exception', 'failed to parse date field [raspberry-pi.home] with format [uuuu-MM-dd HH:mm:ss.SSS]: [failed to parse date field [raspberry-pi.home] with format [uuuu-MM-dd HH:mm:ss.SSS]]')
. Skipping Integration.
2024-03-20 19:14:24,835 - isoar_collector - WARNING - The module ibm_qradar is disabled. Skipping.
2024-03-20 19:14:24,836 - isoar_collector - WARNING - The module matrix_notify is disabled. Skipping.
2024-03-20 19:14:24,836 - isoar_collector - WARNING - The module virus_total is disabled. Skipping.
2024-03-20 19:14:24,836 - isoar_collector - INFO - Finished collector script.
root@IRIS:~/IRIS-SOAR-feature-wazuh-indexer-integration#

please forgive me for taking so long

shaker402 commented 7 months ago

Look: Grafana pulls all events by using the timestamp field, which is the only field that gets hits with your code in branch 2: image

swiftbird07 commented 7 months ago

So the integration is actually for Graylog?

The problem is you are trying to search "raspberry-pi.home" in a timestamp field. Fix: query = "2024-03-21 15:30:45.123" or any other timestamp (though this will return only events at that exact time).

Grafana gets all data from Graylog and sorts it by timestamp (older to newer, I guess). The actual message is in rule_description.

And you want to query all data from Graylog and add them to IRIS? Because if you filter on timestamp only, all results will be returned, not just alerts.
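To illustrate, a search that filters on a time window plus an actual alert field (instead of putting a hostname into the timestamp field) could look roughly like this; the index and field names are the ones from this thread, while the rule-level threshold is only an assumption:

```python
from elasticsearch import Elasticsearch  # the 7.x client seen in the traceback above

es = Elasticsearch(
    "https://192.168.59.128:9200",
    http_auth=("user", "password"),
    verify_certs=False,
)

# Sketch: only documents from the last 15 minutes with a higher rule level.
# "wazuh_alert_1_perfix*", "timestamp" and "rule_level" come from this thread;
# the ">= 7" severity threshold is an assumption.
query = {
    "query": {
        "bool": {
            "filter": [
                {"range": {"timestamp": {"gte": "now-15m"}}},
                {"range": {"rule_level": {"gte": 7}}},
            ]
        }
    },
    "size": 100,
}

res = es.search(index="wazuh_alert_1_perfix*", body=query)
for hit in res["hits"]["hits"]:
    print(hit["_source"].get("rule_description"))
```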

Btw where are you from? I would recommend ChatGPT or DeepL for translation to ensure that we have no miscommunication in the future.

shaker402 commented 7 months ago

Thank you for clarifying that.

I apologize for any mistakes in my previous message. I'm from Yemen and still learning English.

Regarding the JSON file, you mentioned needing one for parsing. Could you tell me which specific JSON file you require?

I'm having an issue: when I try to search for any field other than the timestamp, like "agent_names", I get this message:

INFO - Found 0 hits
INFO - No new alerts found.
INFO - The module wazuh_indexer did not provide any alerts.

just, my requirement from SOAR to automatically do:

  1. Alert Fetching
  2. Alert Processing
  3. Case Processing

I appreciate your help so much. Thank you! Thank you!

swiftbird07 commented 7 months ago

I need the event in raw JSON format from Graylog like what is written in blue in the screenshot you gave me

image

Regarding search: the event from above should be available by searching for the field rule.id with the query 92200.
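One caveat (an assumption based on the dot-to-underscore rewriting described earlier in this thread): in the indexer the field is probably stored as rule_id rather than rule.id, so a quick check could be something like:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("https://192.168.59.128:9200", http_auth=("user", "password"), verify_certs=False)

# "rule_id" is assumed here because Graylog replaces dots with underscores.
res = es.search(
    index="wazuh_alert_1_perfix*",
    body={"query": {"term": {"rule_id": "92200"}}, "size": 10},
)
print(res["hits"]["total"])
```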

shaker402 commented 7 months ago

Did you mean: export the extractors of the wazuh_1 message?

Currently, I have four extractors configured in Graylog, which are applied to every message received from the wazuh-manager as shown in the first screenshot: image

Our goal is to modify only the first extractor, titled "wazuh_1_extractor_0", which affects the format of all received messages:

image

Below is the JSON file of the four extractors:

{ "extractors": [ { "title": "wazuh_1_extractor_0", "extractor_type": "json", "converters": [], "order": 0, "cursor_strategy": "copy", "source_field": "message", "target_field": "", "extractor_config": { "list_separator": ", ", "kv_separator": "=", "key_prefix": "", "keyseparator": "", "replace_key_whitespace": false, "key_whitespacereplacement": "" }, "condition_type": "none", "condition_value": "" }, { "title": "rule_groups1", "extractor_type": "split_and_index", "converters": [], "order": 0, "cursor_strategy": "copy", "source_field": "rule_groups", "target_field": "rule_groups1", "extractor_config": { "index": 1, "split_by": "," }, "condition_type": "none", "condition_value": "" }, { "title": "rule_groups2", "extractor_type": "split_and_index", "converters": [], "order": 0, "cursor_strategy": "copy", "source_field": "rule_groups", "target_field": "rule_groups2", "extractor_config": { "index": 2, "split_by": "," }, "condition_type": "none", "condition_value": "" }, { "title": "rule_groups3", "extractor_type": "split_and_index", "converters": [], "order": 0, "cursor_strategy": "copy", "source_field": "rule_groups", "target_field": "rule_groups3", "extractor_config": { "index": 3, "split_by": "," }, "condition_type": "none", "condition_value": "" } ], "version": "5.0.7" }

I appreciate your help!

Kindly correct me if I have made mistakes.

swiftbird07 commented 7 months ago

We're on the same page regarding receiving the data up to this point. However, since my current setup doesn't mirror yours—lacking Graylog, different Wazuh version, and the like—I'm at a bit of a standstill. I need example JSONs directly from your environment. This is key for me to understand how to correctly parse the data into the Alert format. I'm specifically looking to identify which field corresponds to which part of our Alert class.

Therefore I need you to send me some actual example alerts that the SOAR needs. What I meant with my previous request was this: 314767582-de8404ac-7d91-43ae-8e10-76b5a8a966b2

(make sure to copy the whole valid JSON not just that preview).

I'm navigating this project solo and squeezing it into my free time, so while I'm eager to assist, I need these JSON examples to effectively help out. It's all about making sure we're mapping the data accurately to our predefined structures.

I tried translating this response to Arabic, maybe that helps if you are not so good in English (if this text doesn't make sense please blame ChatGPT :) ):

نحن على نفس الصفحة بخصوص استلام البيانات حتى هذه النقطة. ومع ذلك، بما أن الإعداد الحالي لدي لا يعكس إعدادك - نقص في Graylog، واختلاف في إصدار Wazuh، وما إلى ذلك - أجد نفسي في طريق مسدود. أنا بحاجة إلى أمثلة JSON مباشرة من بيئتك. هذا أساسي بالنسبة لي لفهم كيفية تحليل البيانات بشكل صحيح إلى تنسيق التنبيه. أنا أبحث تحديداً عن تحديد أي حقل يتوافق مع أي جزء من صفّنا للتنبيه.

لذا أحتاج منك إرسال بعض أمثلة التنبيهات الفعلية التي يحتاجها الSOAR. ما كنت أعنيه بطلبي السابق هو هذا: 314767582-de8404ac-7d91-43ae-8e10-76b5a8a966b2

(تأكد من نسخ JSON كاملًا وليس مجرد هذا المعاينة).

أنا أدير هذا المشروع بمفردي وأحاول إيجاد وقت فراغ للعمل عليه، لذا بينما أنا حريص على المساعدة، أحتاج إلى هذه الأمثلة من JSON للمساعدة بفعالية. الأمر كله يتعلق بالتأكد من أننا نقوم بتعيين البيانات بدقة إلى هياكلنا المحددة مسبقًا.

shaker402 commented 7 months ago

noted

shaker402 commented 7 months ago

output.json output_pretty.json

shaker402 commented 7 months ago

These logs are larger than before: output_1000.json output_pretty.json

shaker402 commented 7 months ago

Hi @swiftbird07 ,

I hope this message finds you well. I wanted to express my gratitude for the considerable amount of time and effort you’ve already put into helping me with the IRIS-SOAR enhancement issue. Your expertise and dedication have been invaluable, and many of my requirements have been met thanks to your assistance.

Your help is crucial for me to move forward with this project. I'm looking forward to it once your time allows.

Thanks again for all your help so far! I really appreciate your time and effort in working on this enhancement issue with me. You've been incredibly helpful in clarifying things and getting us this far.

swiftbird07 commented 7 months ago

Give me some time I am working on it

swiftbird07 commented 7 months ago

Btw, how do you plan on acknowledging alerts? The SOAR needs a way to tell if an alert was already handled. In Elastic you can mark an alert as "acknowledged"; in QRadar you can set a flag. Is there a way to do this in Graylog?

بالمناسبة، كيف تخطط للاعتراف بالتنبيهات؟ يحتاج نظام SOAR إلى طريقة لمعرفة إذا كان تم التعامل مع تنبيه مسبقًا. في Elastic، يمكنك تمييز تنبيه على أنه "مُعترف به" وفي QRadar يمكنك تعيين علامة. هل هناك طريقة للقيام بذلك في Graylog؟
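If Graylog itself has no acknowledgement mechanism, one possible workaround (only a sketch; the irsoar_acknowledged field is hypothetical and not an existing IRIS-SOAR or Graylog feature) would be to write a marker field back onto each processed document in the indexer and exclude marked documents from future searches:

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("https://192.168.59.128:9200", http_auth=("user", "password"), verify_certs=False)

def acknowledge(index: str, doc_id: str) -> None:
    # Hypothetical marker written back to the indexed document.
    es.update(index=index, id=doc_id, body={"doc": {"irsoar_acknowledged": True}})

# The collector would then skip documents that already carry the marker.
query = {"query": {"bool": {"must_not": [{"term": {"irsoar_acknowledged": True}}]}}}
res = es.search(index="wazuh_alert_1_perfix*", body=query)
```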

swiftbird07 commented 7 months ago

Try it out.

shaker402 commented 7 months ago

Sorry for the delay; I was sick.

I tried it, but the tested module cannot find any fields.

Also, Graylog currently doesn't have built-in functionality to directly acknowledge alerts.

And there are no alerts in Graylog, because I haven't defined any events yet: image

I just have events in Graylog, and I need that, upon receiving an event, a ticket is created in the IRIS system: image

shaker402 commented 7 months ago

The command I am using is a basic Dev Tools command that retrieves data from the wazuh_alert_1_perfix_0 index in Graylog. Here's the command:

"curl -k -XGET "https://user:password@192.168.59.128:9200/wazuh_alert_1_perfix*/_search" -H "Content-Type: application/json" -d "{ \"size\": 10, \"query\": { \"match_all\": {} }}" -o output.json"

This command queries all documents in the wazuh_alert_1_perfix* indices; the "size": 10 parameter limits the response to the first 10 matching documents.

You can use this curl command to retrieve events from Graylog and save them to an output.json file. You can then use this file to create alerts in IRIS-WEB.

The screenshot below shows the same Dev Tools command:

image

Below is the cURL equivalent of the Dev Tools command used in the screenshot (identical to the command above).

Attached for you is the "output.json" file with 1000 events: output_1000.json
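If more than one page of results is needed, the standard from/size pagination of the search API can be used, roughly like this (a sketch reusing the URL, placeholder credentials, and index pattern from the command above):

```python
import requests

url = "https://192.168.59.128:9200/wazuh_alert_1_perfix*/_search"
auth = ("user", "password")

page_size = 100
for start in range(0, 1000, page_size):
    body = {"from": start, "size": page_size, "query": {"match_all": {}}}
    resp = requests.get(url, json=body, auth=auth, verify=False)  # verify=False matches the -k flag
    hits = resp.json()["hits"]["hits"]
    if not hits:
        break
    # process or save the hits here
```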

swiftbird07 commented 7 months ago

I tried it, but the tested module cannot find any fields

I used the example json you gave me to match the fields. Did you use the newest version? Please show the log when starting IRIS-SOAR and the config.

shaker402 commented 7 months ago

The logs are:

INFO - Found 0 hits
INFO - No new alerts found.
INFO - The module wazuh_indexer did not provide any alerts.

Attached is my configuration file:

image image

shaker402 commented 7 months ago

Sure, I use the newest version. Please guide me if I am running the project wrongly.

I just wrote the required configuration in the config file, and I have tried two values in the fields: the first agent.name and the second agent_name.

And I run the project by starting iris-soar.py.

Or give me your email so I can give you access to Graylog and the wazuh-indexer.

swiftbird07 commented 7 months ago

Please send me the access details to info[at]swiftbird.de

shaker402 commented 6 months ago

Hi swiftbird,

I hope this message finds you well.

Did you receive my email? I wanted to check if you had a chance to see it.

Please let me know if you have any questions or need anything further from me.

Thank you very much for your time and help, dear.