pfelk / docker

Deploy pfelk with docker-compose
Apache License 2.0

the volume for host logstash configuration folder does not match the container folder. #29

Closed jcastillo725 closed 3 years ago

jcastillo725 commented 3 years ago

Describe the bug
I followed the steps for setting up pfelk on Docker, but I don't see logs coming in. It looks like the correct container folder for the .conf files is /usr/share/logstash/pipeline, so I updated the volume in the docker-compose.yml file and was able to get logs, but Logstash shuts down after a few seconds.

Original volume container folder: /etc/pfelk/conf.d:ro
What I changed it to: /usr/share/logstash/pipeline:ro
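
For context, the change would look roughly like this in the logstash service's volumes section of docker-compose.yml (a sketch based on the paths quoted above; the host path and service layout are assumptions, not the shipped file):

    logstash:
      volumes:
        # original mapping: host conf.d mounted at the pfelk path inside the container
        # - ./etc/pfelk/conf.d/:/etc/pfelk/conf.d:ro
        # changed mapping: mount at Logstash's default pipeline folder instead
        - ./etc/pfelk/conf.d/:/usr/share/logstash/pipeline:ro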

To Reproduce
Steps to reproduce the behavior:

  1. Install a fresh ELK stack on Docker using the latest version.


a3ilson commented 3 years ago

Take a look at issue #28 - I plan to reorganize but am also fiddling with merging the docker and host installations into one repo.

jcastillo725 commented 3 years ago

Logstash doesn't see any configuration files on the path "/etc/pfelk/conf.d/*.conf":

logstash | [INFO ] 2021-03-25 03:03:17.111 [Agent thread] configpathloader - No config files found in path {:path=>"/etc/pfelk/conf.d/*.conf"}

The host folder it references is "./etc/logstash/conf.d/", but that folder does not contain the .conf files.
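
(That container path is presumably set by the bundled pipelines.yml; a minimal sketch of what such an entry typically looks like, assuming the pipeline id seen in the logs, not the file's exact contents:)

    - pipeline.id: pfelk
      path.config: "/etc/pfelk/conf.d/*.conf"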


The .conf files are located in "/etc/pfelk/conf.d" on the host (this is where they were extracted when I unzipped pfelkdocker.zip).

I copied the files to "./etc/logstash/conf.d/" and ran into some filter errors. I copied the databases and patterns folders to the logstash/conf.d folder, which got rid of the filter errors, but now I'm getting different messages:

elastic@ubuntu:~/pfelk/etc/pfelk/conf.d$ sudo docker attach logstash
[WARN ] 2021-03-25 03:55:35.576 [Converge PipelineAction::Create] elasticsearch - Relying on default value of `pipeline.ecs_compatibility`, which may change in a future major release of Logstash. To avoid unexpected changes when upgrading Logstash, please explicitly declare your desired ECS Compatibility mode.
[... the same ECS Compatibility warning is logged nine times, once per elasticsearch output ...]
[INFO ] 2021-03-25 03:55:35.836 [[pfelk]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://es01:9200/]}}
[WARN ] 2021-03-25 03:55:35.844 [[pfelk]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://es01:9200/"}
[INFO ] 2021-03-25 03:55:35.860 [[pfelk]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>7}
[WARN ] 2021-03-25 03:55:35.860 [[pfelk]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[INFO ] 2021-03-25 03:55:35.917 [[pfelk]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://es01:9200"]}
[... the five lines above repeat for each of the remaining elasticsearch outputs ...]
[INFO ] 2021-03-25 03:55:36.779 [Ruby-0-Thread-31: /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.1-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:130] elasticsearch - Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[INFO ] 2021-03-25 03:55:36.832 [Ruby-0-Thread-31: /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.1-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:130] elasticsearch - Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[INFO ] 2021-03-25 03:55:36.974 [[pfelk]-pipeline-manager] geoip - Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.3-java/vendor/GeoLite2-City.mmdb"}
[INFO ] 2021-03-25 03:55:37.049 [[pfelk]-pipeline-manager] geoip - Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.3-java/vendor/GeoLite2-City.mmdb"}
[INFO ] 2021-03-25 03:55:37.125 [[pfelk]-pipeline-manager] geoip - Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.3-java/vendor/GeoLite2-ASN.mmdb"}
[INFO ] 2021-03-25 03:55:37.286 [[pfelk]-pipeline-manager] geoip - Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.3-java/vendor/GeoLite2-ASN.mmdb"}
[ERROR] 2021-03-25 03:55:37.321 [[pfelk]-pipeline-manager] javapipeline - Pipeline error {:pipeline_id=>"pfelk", :exception=>#<LogStash::Filters::Dictionary::DictionaryFileError: Translate: Missing or stray quote in line 1 when loading dictionary file at /etc/pfelk/databases/service-names-port-numbers.csv>, :backtrace=>["uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/csv.rb:1899:in `block in shift'", "org/jruby/RubyArray.java:1809:in `each'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/csv.rb:1867:in `block in shift'", "org/jruby/RubyKernel.java:1442:in `loop'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/csv.rb:1821:in `shift'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-translate-3.2.3/lib/logstash/filters/dictionary/csv_file.rb:20:in `block in read_file_into_dictionary'", "org/jruby/RubyIO.java:3511:in `foreach'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-translate-3.2.3/lib/logstash/filters/dictionary/csv_file.rb:18:in `read_file_into_dictionary'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-translate-3.2.3/lib/logstash/filters/dictionary/file.rb:101:in `merge_dictionary'", "org/jruby/RubyMethod.java:115:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-translate-3.2.3/lib/logstash/filters/dictionary/file.rb:66:in `load_dictionary'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-translate-3.2.3/lib/logstash/filters/dictionary/file.rb:53:in `initialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-translate-3.2.3/lib/logstash/filters/dictionary/file.rb:19:in `create'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-translate-3.2.3/lib/logstash/filters/translate.rb:166:in `register'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:75:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:228:in `block in register_plugins'", "org/jruby/RubyArray.java:1809:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:227:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:586:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:240:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:185:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:137:in `block in start'"], "pipeline.sources"=>["/etc/pfelk/conf.d/01-inputs.conf", "/etc/pfelk/conf.d/02-types.conf", "/etc/pfelk/conf.d/03-filter.conf", "/etc/pfelk/conf.d/05-apps.conf", "/etc/pfelk/conf.d/20-interfaces.conf", "/etc/pfelk/conf.d/30-geoip.conf", "/etc/pfelk/conf.d/35-rules-desc.conf", "/etc/pfelk/conf.d/36-ports-desc.conf", "/etc/pfelk/conf.d/37-enhanced_user_agent.conf", "/etc/pfelk/conf.d/38-enhanced_url.conf", "/etc/pfelk/conf.d/45-cleanup.conf", "/etc/pfelk/conf.d/49-enhanced_private.conf", "/etc/pfelk/conf.d/50-outputs.conf"], :thread=>"#<Thread:0x7d02add9@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:125 run>"}
[INFO ] 2021-03-25 03:55:37.323 [[pfelk]-pipeline-manager] javapipeline - Pipeline terminated {"pipeline.id"=>"pfelk"}
[ERROR] 2021-03-25 03:55:37.331 [Converge PipelineAction::Create] agent - Failed to execute action {:id=>:pfelk, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
[INFO ] 2021-03-25 03:55:37.445 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
[INFO ] 2021-03-25 03:55:38.644 [[.monitoring-logstash]-pipeline-manager] javapipeline - Pipeline terminated {"pipeline.id"=>".monitoring-logstash"}
[INFO ] 2021-03-25 03:55:39.417 [LogStash::Runner] runner - Logstash shut down.

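As an aside, the repeated ECS warnings above can be silenced by declaring the compatibility mode explicitly; a minimal sketch, assuming the pre-ECS behavior these configs were written for is what's wanted:

    # logstash.yml
    pipeline.ecs_compatibility: disabled
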
a3ilson commented 3 years ago

Where is the docker-compose.yml file located in reference to the pfelk files?

The docker-compose.yml references those paths relative to its own location (i.e. the preceding dot). Based on your previous response your path is /home/elastic/pfelk/, so for docker-compose to resolve these paths, the docker-compose.yml will need to be placed within /home/elastic/pfelk/:

/home/elastic/pfelk/
├── docker-compose.yml
├── conf.d
│   ├── 01-inputs.conf
│   ├── 02-types.conf
│   ├── 03-filter.conf
│   ├── 05-apps.conf
│   ├── 20-interfaces.conf
│   ├── 30-geoip.conf
│   ├── 35-rules-desc.conf
│   ├── 36-ports-desc.conf
│   ├── 45-cleanup.conf
│   └── 50-outputs.conf
├── config
│   ├── logstash.yml
│   └── pipelines.yml
├── databases
│   ├── private-hostnames.csv
│   ├── rule-names.csv
│   └── service-names-port-numbers.csv
└── patterns
    ├── openvpn.grok
    └── pfelk.grok

Alternatively, you may amend the docker-compose.yml to specify absolute paths:

      - /home/elastic/pfelk/etc/logstash/config/:/usr/share/logstash/config:ro       
      - /home/elastic/pfelk/etc/logstash/conf.d/:/etc/pfelk/conf.d:ro
      - /home/elastic/pfelk/etc/logstash/conf.d/patterns/:/etc/pfelk/patterns:ro
      - /home/elastic/pfelk/etc/logstash/conf.d/databases/:/etc/pfelk/databases:ro

Note: the preceding dot is omitted here, specifying an absolute path rather than a relative one.

Linux paths:
  /   absolute path
  .   relative path (current directory)
  ..  relative path (parent directory)
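
A quick way to verify what the relative paths resolve to (hypothetical session; docker-compose resolves relative host paths against the directory containing the compose file):

    cd /home/elastic/pfelk    # the directory that holds docker-compose.yml
    # relative volume paths (./etc/...) are resolved from this directory
    ls ./etc/
    docker-compose config     # prints the fully resolved compose file, volumes included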

jcastillo725 commented 3 years ago

But the extracted folder /etc/logstash does not contain the subfolder conf.d.


The directory /home/elastic/pfelk/ is what I created to unzip the file into, and the resulting tree looks more like:

/home/elastic/pfelk/
├── docker-compose.yml
├── elasticsearch
│   └── Dockerfile
├── kibana
│   └── Dockerfile
├── logstash
│   └── Dockerfile
└── etc
    ├── logstash
    │   └── config
    │       ├── logstash.yml
    │       └── pipelines.yml
    └── pfelk
        ├── conf.d
        │   ├── 01-inputs.conf
        │   ├── 02-types.conf
        │   ├── 03-filter.conf
        │   ├── 05-apps.conf
        │   ├── 20-interfaces.conf
        │   ├── 30-geoip.conf
        │   ├── 35-rules-desc.conf
        │   ├── 36-ports-desc.conf
        │   ├── 45-cleanup.conf
        │   └── 50-outputs.conf
        ├── databases
        │   ├── private-hostnames.csv
        │   ├── rule-names.csv
        │   └── service-names-port-numbers.csv
        └── patterns
            ├── openvpn.grok (Missing)
            └── pfelk.grok

So the docker-compose.yml is indeed in /home/elastic/pfelk together with the other extracted folders described above, but the conf.d folder is not in ./etc/logstash - it is in ./etc/pfelk - while pipelines.yml points to ./etc/logstash.

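One way to see the mismatch directly (a hypothetical check; the container name logstash is taken from the attach command above):

    # on the host: where the .conf files actually live
    ls /home/elastic/pfelk/etc/pfelk/conf.d/
    # inside the container: where Logstash is looking
    sudo docker exec logstash ls /etc/pfelk/conf.d/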

Btw, please forgive my stubbornness and thanks for your patience... I know there's just something I still don't understand about what you're explaining, but I can't see it yet =(

a3ilson commented 3 years ago

Got it... the docker-compose.yml had an incorrect reference, which has now been corrected.

I would download or update to the latest docker-compose.yml and try again - sorry for the inconvenience.

The pipelines

jcastillo725 commented 3 years ago

Not an inconvenience at all, bro. The work you're putting into this is awesome!

The last time I copied the files to what I thought were the correct references, Logstash shut down again with this error:

[ERROR] 2021-03-25 03:55:37.331 [Converge PipelineAction::Create] agent - Failed to execute action {:id=>:pfelk, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}

The whole log is in my 2nd comment. I will try again with the updated zip file later or tomorrow.

jcastillo725 commented 3 years ago

Just a few more minor adjustments are needed for the following:

      - ./etc/pfelk/conf.d/patterns/:/etc/pfelk/patterns:ro
      - ./etc/pfelk/conf.d/databases/:/etc/pfelk/databases:ro

[ERROR] 2021-03-26 04:52:51.453 [Converge PipelineAction::Create] translate - Invalid setting for translate filter plugin:

  filter {
    translate {
      # This setting must be a path
      # File does not exist or cannot be opened /etc/pfelk/databases/rule-names.csv
      dictionary_path => "/etc/pfelk/databases/rule-names.csv"
      ...
    }
  }

The databases and patterns folders within conf.d do not have the files; the files are in the databases and patterns folders within the pfelk folder, alongside the conf.d folder. Given that layout, the mappings would presumably need adjusting, as sketched below.
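
(My guess at the adjusted mappings, dropping the conf.d segment to match the extracted layout; not the corrected file itself:)

      - ./etc/pfelk/patterns/:/etc/pfelk/patterns:ro
      - ./etc/pfelk/databases/:/etc/pfelk/databases:ro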

So I copied the files to where they were being referenced, and that seems to work, but now I'm getting a new error:

[ERROR] 2021-03-26 05:07:13.699 [[pfelk]-pipeline-manager] javapipeline - Pipeline error {:pipeline_id=>"pfelk", :exception=>#<LogStash::Filters::Dictionary::DictionaryFileError: Translate: Missing or stray quote in line 1 when loading dictionary file at /etc/pfelk/databases/service-names-port-numbers.csv>, :backtrace=>[... same translate/CSV backtrace as in the log above ...], "pipeline.sources"=>[... the same thirteen /etc/pfelk/conf.d/*.conf files as above ...], :thread=>"#<Thread:0x652a68db@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:125 run>"}

PS: Can you check the new pfelkdocker.zip? It looks like it still has the old docker-compose.yml volumes.

a3ilson commented 3 years ago

The zip file was updated and I just tested it on my system - working w/o issues.

jcastillo725 commented 3 years ago

Hmm, that's weird. I downloaded the new one, unzipped it, and still had to copy the databases and patterns from /pfelk to /pfelk/conf.d. Anyway, I was still having the translate errors on service-names-port-numbers.csv and rule-names.csv. I ended up removing filters 35 and 36 and it worked.

I think the errors were weird though, because I checked both CSVs and there were no extra quotation marks (") whatsoever. It's working now, but unfortunately I can't use the two enrichments; I guess it's really OK.
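
(A hypothetical check for the usual invisible culprits - a UTF-8 byte-order mark or CRLF line endings - either of which can trigger a stray-quote error on line 1 of a visually clean CSV:)

    # a UTF-8 BOM shows up as "ef bb bf" at the very start of the file
    head -c 3 service-names-port-numbers.csv | xxd
    # file(1) also flags BOMs and CRLF line terminators
    file service-names-port-numbers.csv rule-names.csv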

See logs and screenshots below for reference:

Errors:

Error: Translate: Missing or stray quote in line 1 when loading dictionary file at /etc/pfelk/databases/service-names-port-numbers.csv (backtrace and pipeline.sources identical to the earlier pipeline error)

[ERROR] 2021-03-26 15:24:19.829 [[pfelk]-pipeline-manager] javapipeline - Pipeline error {:pipeline_id=>"pfelk", :exception=>#<LogStash::Filters::Dictionary::DictionaryFileError: Translate: Missing or stray quote in line 1 when loading dictionary file at /etc/pfelk/databases/rule-names.csv>, :backtrace=>[... same translate/CSV backtrace ...], "pipeline.sources"=>[... as above, this time without 36-ports-desc.conf ...], :thread=>"#<Thread:0x68a190ce@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:125 run>"}

Renamed the extensions of filters 35 and 36 from .conf to .copy.

Logstash is working after the two filters were disabled.

Thanks for the support!

a3ilson commented 3 years ago

Let me test it again on a fresh instance (purging all docker containers/volumes)... I had issues before with the database look-ups, but my current setup is running fine with them.

jcastillo725 commented 3 years ago

Sorry, this is off topic: for Suricata, do I need to install syslog-ng, or will the logs be sent with the firewall's system logs? Currently I don't see Suricata logs coming in.

a3ilson commented 3 years ago

@jcastillo725 - that depends: are you running OPNsense or pfSense?

OPNsense is the simplest as it utilizes syslog-ng natively. pfSense is a bit wonky, but this guide should help in getting it configured and set up.

jcastillo725 commented 3 years ago

Regarding tcp("logstash.local"): do I change this to the IP, since it's on another host? I think I did, but I'm still not getting logs.

a3ilson commented 3 years ago

This screenshot is from issue #276, running pfSense 2.5.0. I currently use OPNsense, but I know multiple people have been able to get it working with the provided wiki instructions.

I would apply the following:

{
   tcp("logstash.local"
   port(5040)
   );
};

and amend logstash.local to the hostname or IP of the host where pfelk is installed.

(screenshot)
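
For example, assuming the pfelk host is at 192.168.1.50 (an illustrative address only), the amended block would read:

    {
       tcp("192.168.1.50"   # replace with your pfelk host or IP
       port(5040)
       );
    };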

jcastillo725 commented 3 years ago

I ended up using OPNsense instead and got all the logs. Have you had a chance to test the lookups on a fresh instance?

a3ilson commented 3 years ago

I am currently running with a fresh instance of:

I plan to merge the two repos and squash the docker repo but that'll be a future endeavor (i.e. once I have additional free time). Let me know if you need or want assistance with setting up this method.

jcastillo725 commented 3 years ago

And you're not getting any errors for service-names-port-numbers.csv and rule-names.csv? I will try that on another VM, but I think we can close this issue now.