
Elastic Integrations
https://www.elastic.co/integrations

[ti_crowdstrike]: integration degraded after update to 8.15.0 and 1.1.3 #10850

Open buzzdeee opened 3 months ago

buzzdeee commented 3 months ago

Integration Name

CrowdStrike Falcon Intelligence [ti_crowdstrike]

Dataset Name

any

Integration Version

1.1.3

Agent Version

8.15.0

Agent Output Type

elasticsearch

Elasticsearch Version

8.15.0

OS Version and Architecture

Ubuntu 20.04 LTS

Software/API Version

CrowdStrike

Error Message

[screenshot: error message]

Event Original

No response

What did you do?

Updated the ELK stack and agents from 8.14.2 to 8.15.0.

What did you see?

See the error message above.

What did you expect to see?

The integration should connect and fetch intel/IOC data as before.

Anything else?

[screenshots: activated API scopes]

elasticmachine commented 3 months ago

Pinging @elastic/security-service-integrations (Team:Security-Service Integrations)

buzzdeee commented 3 months ago

However, in the Fleet agent view, looking at the CEL logs, I see:

12:53:15.683 elastic_agent [elastic_agent][debug] got check-in for endpoint service, tearingDown: false, ignoreCheckins: false
12:53:15.684 elastic_agent [elastic_agent][debug] observed check-in for endpoint service: token:"bd5ebb74-56fa-4fd0-9267-f1be64104752" units:{id:"endpoint-default-6503bf2c-ec03-411b-ac06-6c763c4ca53a" config_state_idx:2 state:HEALTHY message:"Applied policy {6503bf2c-ec03-411b-ac06-6c763c4ca53a}" payload:{fields:{key:"error" value:{struct_value:{fields:{key:"code" value:{number_value:0}} fields:{key:"message" value:{string_value:"Success"}}}}}}} units:{id:"endpoint-default" type:OUTPUT config_state_idx:1 state:HEALTHY message:"Applied policy {6503bf2c-ec03-411b-ac06-6c763c4ca53a}" payload:{fields:{key:"error" value:{struct_value:{fields:{key:"code" value:{number_value:0}} fields:{key:"message" value:{string_value:"Success"}}}}}}} version_info:{name:"Endpoint" build_hash:"f83777fc43e8500f34d20265cced453baa51aa0a"} features_idx:2 pid:216897
12:53:35.684 elastic_agent [elastic_agent][debug] got check-in for endpoint service, tearingDown: false, ignoreCheckins: false
12:53:35.685 elastic_agent [elastic_agent][debug] observed check-in for endpoint service: token:"bd5ebb74-56fa-4fd0-9267-f1be64104752" units:{id:"endpoint-default-6503bf2c-ec03-411b-ac06-6c763c4ca53a" config_state_idx:2 state:HEALTHY message:"Applied policy {6503bf2c-ec03-411b-ac06-6c763c4ca53a}" payload:{fields:{key:"error" value:{struct_value:{fields:{key:"code" value:{number_value:0}} fields:{key:"message" value:{string_value:"Success"}}}}}}} units:{id:"endpoint-default" type:OUTPUT config_state_idx:1 state:HEALTHY message:"Applied policy {6503bf2c-ec03-411b-ac06-6c763c4ca53a}" payload:{fields:{key:"error" value:{struct_value:{fields:{key:"code" value:{number_value:0}} fields:{key:"message" value:{string_value:"Success"}}}}}}} version_info:{name:"Endpoint" build_hash:"f83777fc43e8500f34d20265cced453baa51aa0a"} features_idx:2 pid:216897
buzzdeee commented 3 months ago

The ti_crowdstrike.ioc data_stream.dataset is now filled with documents whose error.message reads:

[
  failed eval: ERROR: <input>:21:26: no extremum of empty list
   |   resp.StatusCode == 200 ?
   | .........................^,
  Processor json with tag json_event_original in pipeline logs-ti_crowdstrike.ioc-1.1.3 failed with message: field [original] not present as part of path [event.original]
]
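For anyone reading along: these are two separate failures. The "no extremum of empty list" error suggests the integration's CEL program applied an extremum function (e.g. math.greatest/math.least from the CEL math extension) to an empty list when evaluating the response, and the second error means the json processor in the ingest pipeline ran even though event.original was never populated. I haven't inspected the actual fix in the linked PR, but as a rough sketch (field and tag names taken from the error message above; everything else is assumption), the pipeline side could be guarded with a conditional so the json processor is skipped when the field is absent:

```json
{
  "processors": [
    {
      "json": {
        "field": "event.original",
        "tag": "json_event_original",
        "if": "ctx.event?.original != null"
      }
    }
  ]
}
```

With the `if` condition, documents that arrive without event.original pass through this processor untouched instead of failing the whole pipeline.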
NateUT99 commented 3 months ago

We currently have a case open for this same issue.

jamiehynds commented 3 months ago

Thanks for reporting @NateUT99 and @buzzdeee. We're currently working on a fix, PR here - https://github.com/elastic/integrations/pull/10861

kcreddy commented 3 months ago

The PR https://github.com/elastic/integrations/pull/10861 is now merged. The fix is available in integration version 1.1.4.

NateUT99 commented 3 months ago

Fix looks good here; thanks!

buzzdeee commented 2 months ago

Looks good here as well, but I don't see any new incoming IoCs or Intel. In fact, the last IoC or Intel I see arrived a month ago. That might be a separate issue, possibly related to #10214?