ntop / PF_RING

High-speed packet processing framework
http://www.ntop.org
GNU Lesser General Public License v2.1

When running Suricata in PF_RING ZC mode, Suricata does not try to connect to Redis. #515

Closed: south-devel closed this issue 3 years ago

south-devel commented 5 years ago

OS : CentOS Linux release 7.7.1908 kernel : 3.10.0-1062.el7.x86_64 Suricata : 4.1.4 RELEASE

  1. run suricata with pf_ring zc. [root@localhost logstash]# PF_RING_FT_CONF=/etc/pf_ring/ft-rules.conf suricata --pfring-int=zc:ens1f0 -c /etc/suricata/suricata.yaml 24/9/2019 -- 17:15:42 - - This is Suricata version 4.1.4 RELEASE 24/9/2019 -- 17:15:42 - - CPUs/cores online: 4 24/9/2019 -- 17:15:42 - - luajit states preallocated: 128 24/9/2019 -- 17:15:42 - - 'default' server has 'request-body-minimal-inspect-size' set to 32133 and 'request-body-inspect-window' set to 3959 after randomization. 24/9/2019 -- 17:15:42 - - 'default' server has 'response-body-minimal-inspect-size' set to 41880 and 'response-body-inspect-window' set to 16890 after randomization. 24/9/2019 -- 17:15:42 - - SMB stream depth: 0 24/9/2019 -- 17:15:42 - - Protocol detection and parser disabled for modbus protocol. 24/9/2019 -- 17:15:42 - - Protocol detection and parser disabled for enip protocol. 24/9/2019 -- 17:15:42 - - Protocol detection and parser disabled for DNP3. 24/9/2019 -- 17:15:42 - - allocated 262144 bytes of memory for the host hash... 4096 buckets of size 64 24/9/2019 -- 17:15:42 - - preallocated 1000 hosts of size 136 24/9/2019 -- 17:15:42 - - host memory usage: 398144 bytes, maximum: 33554432 24/9/2019 -- 17:15:42 - - Max dump is 0 24/9/2019 -- 17:15:42 - - Core dump setting attempted is 0 24/9/2019 -- 17:15:42 - - Core dump size set to 0 24/9/2019 -- 17:15:42 - - allocated 3670016 bytes of memory for the defrag hash... 
65536 buckets of size 56 24/9/2019 -- 17:15:42 - - preallocated 65535 defrag trackers of size 160 24/9/2019 -- 17:15:42 - - defrag memory usage: 14155616 bytes, maximum: 33554432 24/9/2019 -- 17:15:42 - - stream "prealloc-sessions": 2048 (per thread) 24/9/2019 -- 17:15:42 - - stream "memcap": 67108864 24/9/2019 -- 17:15:42 - - stream "midstream" session pickups: disabled 24/9/2019 -- 17:15:42 - - stream "async-oneside": disabled 24/9/2019 -- 17:15:42 - - stream "checksum-validation": enabled 24/9/2019 -- 17:15:42 - - stream."inline": disabled 24/9/2019 -- 17:15:42 - - stream "bypass": disabled 24/9/2019 -- 17:15:42 - - stream "max-synack-queued": 5 24/9/2019 -- 17:15:42 - - stream.reassembly "memcap": 268435456 24/9/2019 -- 17:15:42 - - stream.reassembly "depth": 1048576 24/9/2019 -- 17:15:42 - - stream.reassembly "toserver-chunk-size": 2618 24/9/2019 -- 17:15:42 - - stream.reassembly "toclient-chunk-size": 2519 24/9/2019 -- 17:15:42 - - stream.reassembly.raw: enabled 24/9/2019 -- 17:15:42 - - stream.reassembly "segment-prealloc": 2048 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'alert' 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'http' 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'dns' 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'tls' 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'files' 24/9/2019 -- 17:15:42 - - forcing magic lookup for logged files 24/9/2019 -- 17:15:42 - - forcing sha256 calculation for logged or stored files 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'smtp' 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'nfs' 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'smb' 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'tftp' 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'ikev2' 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'krb5' 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'dhcp' 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'ssh' 24/9/2019 -- 17:15:42 - - enabling 
'eve-log' module 'stats' 24/9/2019 -- 17:15:42 - - [ERRCODE: SC_WARN_EVE_MISSING_EVENTS(318)] - eve.stats will not display all decoder events correctly. See #2225. Set a prefix in stats.decoder-events-prefix. In 5.0 the prefix will default to 'decoder.event'. 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'flow' 24/9/2019 -- 17:15:42 - - enabling 'eve-log' module 'netflow' 24/9/2019 -- 17:15:42 - - stats output device (regular) initialized: stats.log 24/9/2019 -- 17:15:42 - - Delayed detect disabled 24/9/2019 -- 17:15:42 - - Running in live mode, activating unix socket 24/9/2019 -- 17:15:42 - - pattern matchers: MPM: hs, SPM: hs 24/9/2019 -- 17:15:42 - - grouping: tcp-whitelist (default) 53, 80, 139, 443, 445, 1433, 3306, 3389, 6666, 6667, 8080 24/9/2019 -- 17:15:42 - - grouping: udp-whitelist (default) 53, 135, 5060 24/9/2019 -- 17:15:42 - - prefilter engines: MPM 24/9/2019 -- 17:15:42 - - Loading reputation file: /etc/suricata/rules/scirius-iprep.list 24/9/2019 -- 17:15:42 - - host memory usage: 2268688 bytes, maximum: 33554432 24/9/2019 -- 17:15:42 - - Loading rule file: /etc/suricata/rules/scirius.rules 24/9/2019 -- 17:15:48 - - 1 rule files processed. 
18918 rules successfully loaded, 0 rules failed 24/9/2019 -- 17:15:48 - - Threshold config parsed: 0 rule(s) found 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for tcp-packet 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for tcp-stream 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for udp-packet 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for other-ip 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_uri 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_request_line 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_client_body 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_response_line 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_header 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_header 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_header_names 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_header_names 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_accept 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_accept_enc 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_accept_lang 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_referer 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_connection 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_content_len 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_content_len 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_content_type 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_content_type 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_protocol 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_protocol 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_start 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_start 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_raw_header 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_raw_header 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_method 24/9/2019 -- 17:15:48 - - 
using shared mpm ctx' for http_cookie 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_cookie 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_raw_uri 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_user_agent 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_host 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_raw_host 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_stat_msg 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for http_stat_code 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for dns_query 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for tls_sni 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for tls_cert_issuer 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for tls_cert_subject 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for tls_cert_serial 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for tls_cert_fingerprint 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for ja3_hash 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for ja3_string 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for dce_stub_data 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for dce_stub_data 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for smb_named_pipe 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for smb_share 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for ssh_protocol 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for ssh_protocol 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for ssh_software 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for ssh_software 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for file_data 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for file_data 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for file_data 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for file_data 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for krb5_cname 24/9/2019 -- 17:15:48 - - using shared mpm ctx' for krb5_sname 24/9/2019 -- 17:15:48 - - 18921 signatures processed. 
10 are IP-only rules, 5044 are inspecting packet payload, 16091 inspect application layer, 0 are decoder event only 24/9/2019 -- 17:15:48 - - building signature grouping structure, stage 1: preprocessing rules... complete 24/9/2019 -- 17:15:48 - - TCP toserver: 41 port groups, 35 unique SGH's, 6 copies 24/9/2019 -- 17:15:48 - - TCP toclient: 21 port groups, 21 unique SGH's, 0 copies 24/9/2019 -- 17:15:48 - - UDP toserver: 41 port groups, 35 unique SGH's, 6 copies 24/9/2019 -- 17:15:48 - - UDP toclient: 21 port groups, 16 unique SGH's, 5 copies 24/9/2019 -- 17:15:49 - - OTHER toserver: 254 proto groups, 3 unique SGH's, 251 copies 24/9/2019 -- 17:15:49 - - OTHER toclient: 254 proto groups, 0 unique SGH's, 254 copies 24/9/2019 -- 17:15:55 - - Unique rule groups: 110 24/9/2019 -- 17:15:55 - - Builtin MPM "toserver TCP packet": 27 24/9/2019 -- 17:15:55 - - Builtin MPM "toclient TCP packet": 20 24/9/2019 -- 17:15:55 - - Builtin MPM "toserver TCP stream": 27 24/9/2019 -- 17:15:55 - - Builtin MPM "toclient TCP stream": 21 24/9/2019 -- 17:15:55 - - Builtin MPM "toserver UDP packet": 35 24/9/2019 -- 17:15:55 - - Builtin MPM "toclient UDP packet": 15 24/9/2019 -- 17:15:55 - - Builtin MPM "other IP packet": 2 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_uri": 12 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_request_line": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_client_body": 5 24/9/2019 -- 17:15:55 - - AppLayer MPM "toclient http_response_line": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_header": 6 24/9/2019 -- 17:15:55 - - AppLayer MPM "toclient http_header": 3 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_header_names": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_accept": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_referer": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_content_len": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_content_type": 1 24/9/2019 -- 17:15:55 - - 
AppLayer MPM "toclient http_content_type": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_start": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_raw_header": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_method": 3 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_cookie": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toclient http_cookie": 2 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_raw_uri": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_user_agent": 4 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver http_host": 2 24/9/2019 -- 17:15:55 - - AppLayer MPM "toclient http_stat_code": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver dns_query": 4 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver tls_sni": 2 24/9/2019 -- 17:15:55 - - AppLayer MPM "toclient tls_cert_issuer": 2 24/9/2019 -- 17:15:55 - - AppLayer MPM "toclient tls_cert_subject": 2 24/9/2019 -- 17:15:55 - - AppLayer MPM "toclient tls_cert_serial": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver ssh_protocol": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toserver file_data": 1 24/9/2019 -- 17:15:55 - - AppLayer MPM "toclient file_data": 5 24/9/2019 -- 17:16:06 - - ZC interface detected, not setting cluster-id for PF_RING (iface zc:ens1f0) 24/9/2019 -- 17:16:06 - - ZC interface detected, not setting cluster type for PF_RING (iface zc:ens1f0) 24/9/2019 -- 17:16:06 - - [ERRCODE: SC_ERR_SYSCALL(50)] - Failure when trying to get feature via ioctl for 'zc:ens1f0': No such device (19) 24/9/2019 -- 17:16:06 - - [ERRCODE: SC_ERR_SYSCALL(50)] - Failure when trying to get feature via ioctl for 'zc:ens1f0': No such device (19) 24/9/2019 -- 17:16:06 - - [ERRCODE: SC_ERR_SYSCALL(50)] - Failure when trying to get feature via ioctl for 'zc:ens1f0': No such device (19) 24/9/2019 -- 17:16:06 - - [ERRCODE: SC_ERR_SYSCALL(50)] - Failure when trying to get feature via ioctl for 'zc:ens1f0': No such device (19) 24/9/2019 -- 17:16:06 - - [ERRCODE: SC_ERR_SYSCALL(50)] - Failure when 
trying to get feature via ioctl for 'zc:ens1f0': No such device (19) 24/9/2019 -- 17:16:06 - - Going to use 1 thread(s) 24/9/2019 -- 17:16:06 - - Enabling zero-copy for zc:ens1f0 24/9/2019 -- 17:16:07 - - ZC interface detected, not adding thread to cluster 24/9/2019 -- 17:16:07 - - (W#01-zc:ens1f0) Using PF_RING v.7.5.0, interface zc:ens1f0, cluster-id 1, single-pfring-thread 24/9/2019 -- 17:16:07 - - RunModeIdsPfringWorkers initialised 24/9/2019 -- 17:16:07 - - using 1 flow manager threads 24/9/2019 -- 17:16:07 - - using 1 flow recycler threads 24/9/2019 -- 17:16:07 - - Running in live mode, activating unix socket 24/9/2019 -- 17:16:07 - - Using unix socket file '/var/run/suricata/suricata-command.socket' 24/9/2019 -- 17:16:07 - - all 1 packet processing threads, 2 management threads initialized, engine started. 24/9/2019 -- 17:16:07 - - [ERRCODE: SC_ERR_PF_RING_VLAN(304)] - no VLAN header in the raw packet. See #2355. 24/9/2019 -- 17:18:32 - - Signal Received. Stopping engine. 24/9/2019 -- 17:18:32 - - 0 new flows, 0 established flows were timed out, 0 flows in closed state 24/9/2019 -- 17:18:32 - - time elapsed 145.572s 24/9/2019 -- 17:18:32 - - 0 flows processed 24/9/2019 -- 17:18:32 - - (W#01-zc:ens1f0) Kernel: Packets 49259, dropped 0 24/9/2019 -- 17:18:32 - - (W#01-zc:ens1f0) Packets 49259, bytes 44128784 24/9/2019 -- 17:18:32 - - Alerts: 0 24/9/2019 -- 17:18:32 - - ippair memory usage: 414144 bytes, maximum: 16777216 24/9/2019 -- 17:18:32 - - host memory usage: 2268688 bytes, maximum: 33554432 24/9/2019 -- 17:18:32 - - cleaning up signature grouping structure... complete 24/9/2019 -- 17:18:32 - - Stats for 'zc:ens1f0': pkts: 49259, drop: 0 (0.00%), invalid chksum: 0 24/9/2019 -- 17:18:32 - - Cleaning up Hyperscan global scratch 24/9/2019 -- 17:18:32 - - Clearing Hyperscan database cache
  2. Run Suricata without PF_RING (AF_PACKET, same config). 24/9/2019 -- 17:19:26 - - This is Suricata version 4.1.4 RELEASE 24/9/2019 -- 17:19:26 - - CPUs/cores online: 4 24/9/2019 -- 17:19:26 - - luajit states preallocated: 128 24/9/2019 -- 17:19:26 - - 'default' server has 'request-body-minimal-inspect-size' set to 32165 and 'request-body-inspect-window' set to 4055 after randomization. 24/9/2019 -- 17:19:26 - - 'default' server has 'response-body-minimal-inspect-size' set to 39328 and 'response-body-inspect-window' set to 15681 after randomization. 24/9/2019 -- 17:19:26 - - SMB stream depth: 0 24/9/2019 -- 17:19:26 - - Protocol detection and parser disabled for modbus protocol. 24/9/2019 -- 17:19:26 - - Protocol detection and parser disabled for enip protocol. 24/9/2019 -- 17:19:26 - - Protocol detection and parser disabled for DNP3. 24/9/2019 -- 17:19:26 - - allocated 262144 bytes of memory for the host hash... 4096 buckets of size 64 24/9/2019 -- 17:19:26 - - preallocated 1000 hosts of size 136 24/9/2019 -- 17:19:26 - - host memory usage: 398144 bytes, maximum: 33554432 24/9/2019 -- 17:19:26 - - Max dump is 0 24/9/2019 -- 17:19:26 - - Core dump setting attempted is 0 24/9/2019 -- 17:19:26 - - Core dump size set to 0 24/9/2019 -- 17:19:26 - - allocated 3670016 bytes of memory for the defrag hash... 
65536 buckets of size 56 24/9/2019 -- 17:19:26 - - preallocated 65535 defrag trackers of size 160 24/9/2019 -- 17:19:26 - - defrag memory usage: 14155616 bytes, maximum: 33554432 24/9/2019 -- 17:19:26 - - stream "prealloc-sessions": 2048 (per thread) 24/9/2019 -- 17:19:26 - - stream "memcap": 67108864 24/9/2019 -- 17:19:26 - - stream "midstream" session pickups: disabled 24/9/2019 -- 17:19:26 - - stream "async-oneside": disabled 24/9/2019 -- 17:19:26 - - stream "checksum-validation": enabled 24/9/2019 -- 17:19:26 - - stream."inline": disabled 24/9/2019 -- 17:19:26 - - stream "bypass": disabled 24/9/2019 -- 17:19:26 - - stream "max-synack-queued": 5 24/9/2019 -- 17:19:26 - - stream.reassembly "memcap": 268435456 24/9/2019 -- 17:19:26 - - stream.reassembly "depth": 1048576 24/9/2019 -- 17:19:26 - - stream.reassembly "toserver-chunk-size": 2439 24/9/2019 -- 17:19:26 - - stream.reassembly "toclient-chunk-size": 2492 24/9/2019 -- 17:19:26 - - stream.reassembly.raw: enabled 24/9/2019 -- 17:19:26 - - stream.reassembly "segment-prealloc": 2048 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'alert' 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'http' 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'dns' 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'tls' 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'files' 24/9/2019 -- 17:19:26 - - forcing magic lookup for logged files 24/9/2019 -- 17:19:26 - - forcing sha256 calculation for logged or stored files 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'smtp' 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'nfs' 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'smb' 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'tftp' 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'ikev2' 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'krb5' 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'dhcp' 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'ssh' 24/9/2019 -- 17:19:26 - - enabling 
'eve-log' module 'stats' 24/9/2019 -- 17:19:26 - - [ERRCODE: SC_WARN_EVE_MISSING_EVENTS(318)] - eve.stats will not display all decoder events correctly. See #2225. Set a prefix in stats.decoder-events-prefix. In 5.0 the prefix will default to 'decoder.event'. 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'flow' 24/9/2019 -- 17:19:26 - - enabling 'eve-log' module 'netflow' 24/9/2019 -- 17:19:26 - - stats output device (regular) initialized: stats.log 24/9/2019 -- 17:19:26 - - Delayed detect disabled 24/9/2019 -- 17:19:26 - - Running in live mode, activating unix socket 24/9/2019 -- 17:19:26 - - pattern matchers: MPM: hs, SPM: hs 24/9/2019 -- 17:19:26 - - grouping: tcp-whitelist (default) 53, 80, 139, 443, 445, 1433, 3306, 3389, 6666, 6667, 8080 24/9/2019 -- 17:19:26 - - grouping: udp-whitelist (default) 53, 135, 5060 24/9/2019 -- 17:19:26 - - prefilter engines: MPM 24/9/2019 -- 17:19:26 - - Loading reputation file: /etc/suricata/rules/scirius-iprep.list 24/9/2019 -- 17:19:26 - - host memory usage: 2268688 bytes, maximum: 33554432 24/9/2019 -- 17:19:26 - - Loading rule file: /etc/suricata/rules/scirius.rules 24/9/2019 -- 17:19:33 - - 1 rule files processed. 
18918 rules successfully loaded, 0 rules failed 24/9/2019 -- 17:19:33 - - Threshold config parsed: 0 rule(s) found 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for tcp-packet 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for tcp-stream 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for udp-packet 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for other-ip 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_uri 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_request_line 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_client_body 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_response_line 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_header 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_header 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_header_names 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_header_names 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_accept 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_accept_enc 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_accept_lang 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_referer 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_connection 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_content_len 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_content_len 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_content_type 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_content_type 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_protocol 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_protocol 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_start 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_start 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_raw_header 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_raw_header 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_method 24/9/2019 -- 17:19:33 - - 
using shared mpm ctx' for http_cookie 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_cookie 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_raw_uri 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_user_agent 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_host 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_raw_host 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_stat_msg 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for http_stat_code 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for dns_query 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for tls_sni 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for tls_cert_issuer 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for tls_cert_subject 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for tls_cert_serial 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for tls_cert_fingerprint 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for ja3_hash 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for ja3_string 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for dce_stub_data 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for dce_stub_data 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for smb_named_pipe 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for smb_share 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for ssh_protocol 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for ssh_protocol 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for ssh_software 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for ssh_software 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for file_data 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for file_data 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for file_data 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for file_data 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for krb5_cname 24/9/2019 -- 17:19:33 - - using shared mpm ctx' for krb5_sname 24/9/2019 -- 17:19:33 - - 18921 signatures processed. 
10 are IP-only rules, 5044 are inspecting packet payload, 16091 inspect application layer, 0 are decoder event only 24/9/2019 -- 17:19:33 - - building signature grouping structure, stage 1: preprocessing rules... complete 24/9/2019 -- 17:19:33 - - TCP toserver: 41 port groups, 35 unique SGH's, 6 copies 24/9/2019 -- 17:19:33 - - TCP toclient: 21 port groups, 21 unique SGH's, 0 copies 24/9/2019 -- 17:19:33 - - UDP toserver: 41 port groups, 35 unique SGH's, 6 copies 24/9/2019 -- 17:19:33 - - UDP toclient: 21 port groups, 16 unique SGH's, 5 copies 24/9/2019 -- 17:19:33 - - OTHER toserver: 254 proto groups, 3 unique SGH's, 251 copies 24/9/2019 -- 17:19:34 - - OTHER toclient: 254 proto groups, 0 unique SGH's, 254 copies 24/9/2019 -- 17:19:40 - - Unique rule groups: 110 24/9/2019 -- 17:19:40 - - Builtin MPM "toserver TCP packet": 27 24/9/2019 -- 17:19:40 - - Builtin MPM "toclient TCP packet": 20 24/9/2019 -- 17:19:40 - - Builtin MPM "toserver TCP stream": 27 24/9/2019 -- 17:19:40 - - Builtin MPM "toclient TCP stream": 21 24/9/2019 -- 17:19:40 - - Builtin MPM "toserver UDP packet": 35 24/9/2019 -- 17:19:40 - - Builtin MPM "toclient UDP packet": 15 24/9/2019 -- 17:19:40 - - Builtin MPM "other IP packet": 2 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_uri": 12 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_request_line": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_client_body": 5 24/9/2019 -- 17:19:40 - - AppLayer MPM "toclient http_response_line": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_header": 6 24/9/2019 -- 17:19:40 - - AppLayer MPM "toclient http_header": 3 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_header_names": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_accept": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_referer": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_content_len": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_content_type": 1 24/9/2019 -- 17:19:40 - - 
AppLayer MPM "toclient http_content_type": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_start": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_raw_header": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_method": 3 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_cookie": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toclient http_cookie": 2 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_raw_uri": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_user_agent": 4 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver http_host": 2 24/9/2019 -- 17:19:40 - - AppLayer MPM "toclient http_stat_code": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver dns_query": 4 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver tls_sni": 2 24/9/2019 -- 17:19:40 - - AppLayer MPM "toclient tls_cert_issuer": 2 24/9/2019 -- 17:19:40 - - AppLayer MPM "toclient tls_cert_subject": 2 24/9/2019 -- 17:19:40 - - AppLayer MPM "toclient tls_cert_serial": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver ssh_protocol": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toserver file_data": 1 24/9/2019 -- 17:19:40 - - AppLayer MPM "toclient file_data": 5 24/9/2019 -- 17:19:51 - - 4 cores, so using 4 threads 24/9/2019 -- 17:19:51 - - Using 4 AF_PACKET threads for interface ens1f0 24/9/2019 -- 17:19:51 - - ens1f0: enabling zero copy mode by using data release call 24/9/2019 -- 17:19:51 - - Going to use 4 thread(s) 24/9/2019 -- 17:19:51 - - using 1 flow manager threads 24/9/2019 -- 17:19:51 - - using 1 flow recycler threads 24/9/2019 -- 17:19:51 - - Running in live mode, activating unix socket 24/9/2019 -- 17:19:51 - - Using unix socket file '/var/run/suricata/suricata-command.socket' 24/9/2019 -- 17:19:51 - - all 4 packet processing threads, 2 management threads initialized, engine started. 
24/9/2019 -- 17:19:51 - - AF_PACKET RX Ring params: block_size=32768 block_nr=26 frame_size=1584 frame_nr=520 24/9/2019 -- 17:19:51 - - AF_PACKET RX Ring params: block_size=32768 block_nr=26 frame_size=1584 frame_nr=520 24/9/2019 -- 17:19:51 - - AF_PACKET RX Ring params: block_size=32768 block_nr=26 frame_size=1584 frame_nr=520 24/9/2019 -- 17:19:51 - - AF_PACKET RX Ring params: block_size=32768 block_nr=26 frame_size=1584 frame_nr=520 24/9/2019 -- 17:19:51 - - All AFP capture threads are running. 24/9/2019 -- 17:19:52 - - Trying to connect to Redis 24/9/2019 -- 17:19:52 - - Connected to Redis. 24/9/2019 -- 17:22:04 - - Signal Received. Stopping engine. 24/9/2019 -- 17:22:04 - - 0 new flows, 0 established flows were timed out, 0 flows in closed state 24/9/2019 -- 17:22:04 - - time elapsed 133.340s 24/9/2019 -- 17:22:04 - - 302 flows processed 24/9/2019 -- 17:22:04 - - (W#01-ens1f0) Kernel: Packets 8525, dropped 0 24/9/2019 -- 17:22:04 - - (W#02-ens1f0) Kernel: Packets 4610, dropped 0 24/9/2019 -- 17:22:04 - - (W#03-ens1f0) Kernel: Packets 3089, dropped 0 24/9/2019 -- 17:22:04 - - (W#04-ens1f0) Kernel: Packets 29637, dropped 0 24/9/2019 -- 17:22:04 - - Alerts: 0 24/9/2019 -- 17:22:04 - - QUIT Command sent to redis. Connection will terminate! 24/9/2019 -- 17:22:04 - - Missing reply from redis, disconnected. 24/9/2019 -- 17:22:04 - - Disconnecting from redis! 24/9/2019 -- 17:22:04 - - ippair memory usage: 414144 bytes, maximum: 16777216 24/9/2019 -- 17:22:05 - - host memory usage: 2268688 bytes, maximum: 33554432 24/9/2019 -- 17:22:05 - - cleaning up signature grouping structure... complete 24/9/2019 -- 17:22:05 - - Stats for 'ens1f0': pkts: 45861, drop: 0 (0.00%), invalid chksum: 0 24/9/2019 -- 17:22:05 - - Cleaning up Hyperscan global scratch 24/9/2019 -- 17:22:05 - - Clearing Hyperscan database cache
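The two runs differ in only a handful of startup messages, notably the Redis ones. The difference can be isolated mechanically; here is a minimal sketch (not part of Suricata) that assumes the `D/M/YYYY -- HH:MM:SS - - ` log prefix shown above:

```python
# Compare two Suricata startup logs and report messages that appear only in
# the second run. The timestamp prefix ("24/9/2019 -- 17:19:52 - - ") is
# stripped before comparing.
import re

TS = re.compile(r"^\d{1,2}/\d{1,2}/\d{4} -- \d{2}:\d{2}:\d{2} - - ")

def messages(log):
    """Return the set of log messages with the timestamp prefix removed."""
    return {TS.sub("", line).strip() for line in log.splitlines() if TS.match(line)}

def only_in_second(log_a, log_b):
    """Messages present in log_b but absent from log_a."""
    return messages(log_b) - messages(log_a)

# Tiny excerpt from the runs above: the Redis lines show up only in AF_PACKET.
zc_run = "24/9/2019 -- 17:16:07 - - all 1 packet processing threads, 2 management threads initialized, engine started."
afp_run = (
    "24/9/2019 -- 17:19:51 - - all 4 packet processing threads, 2 management threads initialized, engine started.\n"
    "24/9/2019 -- 17:19:52 - - Trying to connect to Redis\n"
    "24/9/2019 -- 17:19:52 - - Connected to Redis."
)
print(only_in_second(zc_run, afp_run))
```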

I don't know whether this is a PF_RING problem or a Suricata problem. However, some of the errors that PF_RING raises seem to make Suricata skip some of its configured features.
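For context, the Redis connection being skipped corresponds to an `eve-log` output with `filetype: redis` in suricata.yaml. A sketch, not the reporter's actual configuration; the server address and key are placeholders:

```yaml
outputs:
  - eve-log:
      enabled: yes
      filetype: redis
      redis:
        server: 127.0.0.1   # placeholder; the actual Redis address is not shown in the issue
        port: 6379
        mode: list          # push events to a Redis list
        key: suricata       # placeholder list key
```

In the AF_PACKET run this output produces the "Trying to connect to Redis" / "Connected to Redis." lines; in the ZC run those lines never appear.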

south-devel commented 5 years ago

It works fine with vanilla PF_RING. And I found that vanilla PF_RING logs this:

24/9/2019 -- 21:56:28 - - Enabling zero-copy for ens1f0

Is this correct? Does PF_RING zero-copy run in Standard Mode? (https://www.ntop.org/guides/pf_ring/thirdparty/suricata.html#standard-mode)
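For comparison, the two capture setups look roughly like this in the `pfring` section of suricata.yaml. A sketch based on the runs above, not a definitive configuration:

```yaml
pfring:
  # Standard mode: PF_RING kernel module, with an explicit cluster
  # for flow-based load balancing across threads.
  - interface: ens1f0
    threads: 4
    cluster-id: 99
    cluster-type: cluster_flow

  # ZC mode: the zc: prefix selects the zero-copy driver, which bypasses
  # the kernel; cluster-id/cluster-type are then ignored, matching the
  # "ZC interface detected, not setting cluster-id" messages in the log.
  # - interface: zc:ens1f0
  #   threads: 1
```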

[root@localhost system]# PF_RING_FT_CONF=/etc/pf_ring/ft-rules.conf suricata --pfring-int=ens1f0 --pfring-cluster-id=99 --pfring-cluster-type=cluster_flow -c /etc/suricata/suricata.yaml -vvv
24/9/2019 -- 21:56:03 - - This is Suricata version 4.1.4 RELEASE
24/9/2019 -- 21:56:03 - - CPUs/cores online: 4
24/9/2019 -- 21:56:03 - - luajit states preallocated: 128
24/9/2019 -- 21:56:03 - - 'default' server has 'request-body-minimal-inspect-size' set to 32649 and 'request-body-inspect-window' set to 4229 after randomization.
24/9/2019 -- 21:56:03 - - 'default' server has 'response-body-minimal-inspect-size' set to 39307 and 'response-body-inspect-window' set to 16538 after randomization.
24/9/2019 -- 21:56:03 - - SMB stream depth: 0
24/9/2019 -- 21:56:03 - - Protocol detection and parser disabled for modbus protocol.
24/9/2019 -- 21:56:03 - - Protocol detection and parser disabled for enip protocol.
24/9/2019 -- 21:56:03 - - Protocol detection and parser disabled for DNP3.
24/9/2019 -- 21:56:03 - - allocated 262144 bytes of memory for the host hash... 4096 buckets of size 64
24/9/2019 -- 21:56:03 - - preallocated 1000 hosts of size 136
24/9/2019 -- 21:56:03 - - host memory usage: 398144 bytes, maximum: 33554432
24/9/2019 -- 21:56:03 - - Max dump is 0
24/9/2019 -- 21:56:03 - - Core dump setting attempted is 0
24/9/2019 -- 21:56:03 - - Core dump size set to 0
24/9/2019 -- 21:56:03 - - allocated 3670016 bytes of memory for the defrag hash... 65536 buckets of size 56
24/9/2019 -- 21:56:03 - - preallocated 65535 defrag trackers of size 160
24/9/2019 -- 21:56:03 - - defrag memory usage: 14155616 bytes, maximum: 33554432
24/9/2019 -- 21:56:03 - - stream "prealloc-sessions": 2048 (per thread)
24/9/2019 -- 21:56:03 - - stream "memcap": 67108864
24/9/2019 -- 21:56:03 - - stream "midstream" session pickups: disabled
24/9/2019 -- 21:56:03 - - stream "async-oneside": disabled
24/9/2019 -- 21:56:03 - - stream "checksum-validation": enabled
24/9/2019 -- 21:56:03 - - stream."inline": disabled
24/9/2019 -- 21:56:03 - - stream "bypass": disabled
24/9/2019 -- 21:56:03 - - stream "max-synack-queued": 5
24/9/2019 -- 21:56:03 - - stream.reassembly "memcap": 268435456
24/9/2019 -- 21:56:03 - - stream.reassembly "depth": 1048576
24/9/2019 -- 21:56:03 - - stream.reassembly "toserver-chunk-size": 2503
24/9/2019 -- 21:56:03 - - stream.reassembly "toclient-chunk-size": 2524
24/9/2019 -- 21:56:03 - - stream.reassembly.raw: enabled
24/9/2019 -- 21:56:03 - - stream.reassembly "segment-prealloc": 2048
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'alert'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'http'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'dns'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'tls'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'files'
24/9/2019 -- 21:56:03 - - forcing magic lookup for logged files
24/9/2019 -- 21:56:03 - - forcing sha256 calculation for logged or stored files
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'smtp'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'nfs'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'smb'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'tftp'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'ikev2'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'krb5'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'dhcp'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'ssh'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'stats'
24/9/2019 -- 21:56:03 - - [ERRCODE: SC_WARN_EVE_MISSING_EVENTS(318)] - eve.stats will not display all decoder events correctly. See #2225. Set a prefix in stats.decoder-events-prefix. In 5.0 the prefix will default to 'decoder.event'.
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'flow'
24/9/2019 -- 21:56:03 - - enabling 'eve-log' module 'netflow'
24/9/2019 -- 21:56:03 - - stats output device (regular) initialized: stats.log
24/9/2019 -- 21:56:03 - - Delayed detect disabled
24/9/2019 -- 21:56:03 - - Running in live mode, activating unix socket
24/9/2019 -- 21:56:03 - - pattern matchers: MPM: hs, SPM: hs
24/9/2019 -- 21:56:03 - - grouping: tcp-whitelist (default) 53, 80, 139, 443, 445, 1433, 3306, 3389, 6666, 6667, 8080
24/9/2019 -- 21:56:03 - - grouping: udp-whitelist (default) 53, 135, 5060
24/9/2019 -- 21:56:03 - - prefilter engines: MPM
24/9/2019 -- 21:56:03 - - Loading reputation file: /etc/suricata/rules/scirius-iprep.list
24/9/2019 -- 21:56:03 - - host memory usage: 2268688 bytes, maximum: 33554432
24/9/2019 -- 21:56:03 - - Loading rule file: /etc/suricata/rules/scirius.rules
24/9/2019 -- 21:56:10 - - 1 rule files processed. 18918 rules successfully loaded, 0 rules failed
24/9/2019 -- 21:56:10 - - Threshold config parsed: 0 rule(s) found
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for tcp-packet
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for tcp-stream
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for udp-packet
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for other-ip
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_uri
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_request_line
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_client_body
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_response_line
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_header
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_header
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_header_names
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_header_names
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_accept
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_accept_enc
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_accept_lang
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_referer
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_connection
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_content_len
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_content_len
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_content_type
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_content_type
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_protocol
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_protocol
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_start
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_start
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_raw_header
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_raw_header
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_method
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_cookie
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_cookie
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_raw_uri
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_user_agent
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_host
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_raw_host
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_stat_msg
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for http_stat_code
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for dns_query
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for tls_sni
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for tls_cert_issuer
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for tls_cert_subject
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for tls_cert_serial
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for tls_cert_fingerprint
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for ja3_hash
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for ja3_string
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for dce_stub_data
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for dce_stub_data
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for smb_named_pipe
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for smb_share
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for ssh_protocol
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for ssh_protocol
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for ssh_software
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for ssh_software
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for file_data
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for file_data
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for file_data
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for file_data
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for krb5_cname
24/9/2019 -- 21:56:10 - - using shared mpm ctx' for krb5_sname
24/9/2019 -- 21:56:10 - - 18921 signatures processed. 10 are IP-only rules, 5044 are inspecting packet payload, 16091 inspect application layer, 0 are decoder event only
24/9/2019 -- 21:56:10 - - building signature grouping structure, stage 1: preprocessing rules... complete
24/9/2019 -- 21:56:10 - - TCP toserver: 41 port groups, 35 unique SGH's, 6 copies
24/9/2019 -- 21:56:10 - - TCP toclient: 21 port groups, 21 unique SGH's, 0 copies
24/9/2019 -- 21:56:10 - - UDP toserver: 41 port groups, 35 unique SGH's, 6 copies
24/9/2019 -- 21:56:10 - - UDP toclient: 21 port groups, 16 unique SGH's, 5 copies
24/9/2019 -- 21:56:10 - - OTHER toserver: 254 proto groups, 3 unique SGH's, 251 copies
24/9/2019 -- 21:56:10 - - OTHER toclient: 254 proto groups, 0 unique SGH's, 254 copies
24/9/2019 -- 21:56:16 - - Unique rule groups: 110
24/9/2019 -- 21:56:16 - - Builtin MPM "toserver TCP packet": 27
24/9/2019 -- 21:56:16 - - Builtin MPM "toclient TCP packet": 20
24/9/2019 -- 21:56:16 - - Builtin MPM "toserver TCP stream": 27
24/9/2019 -- 21:56:16 - - Builtin MPM "toclient TCP stream": 21
24/9/2019 -- 21:56:16 - - Builtin MPM "toserver UDP packet": 35
24/9/2019 -- 21:56:16 - - Builtin MPM "toclient UDP packet": 15
24/9/2019 -- 21:56:16 - - Builtin MPM "other IP packet": 2
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_uri": 12
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_request_line": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_client_body": 5
24/9/2019 -- 21:56:16 - - AppLayer MPM "toclient http_response_line": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_header": 6
24/9/2019 -- 21:56:16 - - AppLayer MPM "toclient http_header": 3
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_header_names": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_accept": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_referer": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_content_len": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_content_type": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toclient http_content_type": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_start": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_raw_header": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_method": 3
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_cookie": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toclient http_cookie": 2
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_raw_uri": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_user_agent": 4
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver http_host": 2
24/9/2019 -- 21:56:16 - - AppLayer MPM "toclient http_stat_code": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver dns_query": 4
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver tls_sni": 2
24/9/2019 -- 21:56:16 - - AppLayer MPM "toclient tls_cert_issuer": 2
24/9/2019 -- 21:56:16 - - AppLayer MPM "toclient tls_cert_subject": 2
24/9/2019 -- 21:56:16 - - AppLayer MPM "toclient tls_cert_serial": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver ssh_protocol": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toserver file_data": 1
24/9/2019 -- 21:56:16 - - AppLayer MPM "toclient file_data": 5
24/9/2019 -- 21:56:28 - - Using flow cluster mode for PF_RING (iface ens1f0)
24/9/2019 -- 21:56:28 - - Going to use 1 thread(s)
24/9/2019 -- 21:56:28 - - Enabling zero-copy for ens1f0
24/9/2019 -- 21:56:28 - - (W#01-ens1f0) Using PF_RING v.7.5.0, interface ens1f0, cluster-id 99, single-pfring-thread
24/9/2019 -- 21:56:28 - - RunModeIdsPfringWorkers initialised
24/9/2019 -- 21:56:28 - - using 1 flow manager threads
24/9/2019 -- 21:56:28 - - using 1 flow recycler threads
24/9/2019 -- 21:56:28 - - Running in live mode, activating unix socket
24/9/2019 -- 21:56:28 - - Using unix socket file '/var/run/suricata/suricata-command.socket'
24/9/2019 -- 21:56:28 - - all 1 packet processing threads, 2 management threads initialized, engine started.
24/9/2019 -- 21:56:28 - - Trying to connect to Redis
24/9/2019 -- 21:56:28 - - Connected to Redis.
^C24/9/2019 -- 21:57:22 - - Signal Received. Stopping engine.
24/9/2019 -- 21:57:22 - - 0 new flows, 0 established flows were timed out, 0 flows in closed state
24/9/2019 -- 21:57:22 - - time elapsed 54.682s
24/9/2019 -- 21:57:22 - - 119 flows processed
24/9/2019 -- 21:57:22 - - (W#01-ens1f0) Kernel: Packets 16235, dropped 0
24/9/2019 -- 21:57:22 - - (W#01-ens1f0) Packets 16233, bytes 14937679
24/9/2019 -- 21:57:23 - - Alerts: 0
24/9/2019 -- 21:57:23 - - QUIT Command sent to redis. Connection will terminate!
24/9/2019 -- 21:57:23 - - Missing reply from redis, disconnected.
24/9/2019 -- 21:57:23 - - Disconnecting from redis!
24/9/2019 -- 21:57:23 - - ippair memory usage: 414144 bytes, maximum: 16777216
24/9/2019 -- 21:57:23 - - host memory usage: 2268688 bytes, maximum: 33554432
24/9/2019 -- 21:57:23 - - cleaning up signature grouping structure... complete
24/9/2019 -- 21:57:23 - - Stats for 'ens1f0': pkts: 16235, drop: 0 (0.00%), invalid chksum: 0
24/9/2019 -- 21:57:23 - - Cleaning up Hyperscan global scratch
24/9/2019 -- 21:57:23 - - Clearing Hyperscan database cache
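For context, the "Trying to connect to Redis" lines above come from Suricata's eve-log output being configured with a Redis backend in suricata.yaml. A minimal sketch of that configuration is shown below; the server address, key name, and event types are assumptions to illustrate the shape, not the reporter's actual config:

```yaml
# suricata.yaml (fragment) -- eve-log routed to Redis instead of a file.
# Host, port, key, and types below are illustrative assumptions.
outputs:
  - eve-log:
      enabled: yes
      filetype: redis        # send EVE JSON to Redis rather than eve.json
      redis:
        server: 127.0.0.1    # assumed Redis host
        port: 6379           # default Redis port
        mode: list           # push events onto a Redis list
        key: suricata        # assumed list key (e.g. consumed by Logstash)
      types:
        - alert
        - flow
```

If this block is present, Suricata should log the "Trying to connect to Redis" / "Connected to Redis." pair at startup in any capture mode, which is why its absence in the ZC run is the symptom being reported.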

south-devel commented 5 years ago

The output of the pf_ringcfg --list-interfaces command is below:

[root@localhost src]# pf_ringcfg --list-interfaces
Name: ens1f2 Driver: igb [Running ZC]
Name: enp2s0f0 Driver: tg3
Name: enp2s0f1 Driver: tg3
Name: ens1f3 Driver: igb [Running ZC]
Name: ens1f0 Driver: igb [Running ZC]
Name: ens1f1 Driver: igb [Running ZC]
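To quickly pick out which interfaces report a ZC-capable driver from output like the above, a small parsing sketch (the line format is taken from the listing; the function name and sample data are illustrative):

```python
import re

def parse_pf_ringcfg(output: str):
    """Parse 'pf_ringcfg --list-interfaces' lines into (name, driver, zc) tuples."""
    entries = []
    for line in output.strip().splitlines():
        # Expected shape: "Name: <iface> Driver: <drv> [Running ZC]" (suffix optional)
        m = re.match(r"Name:\s+(\S+)\s+Driver:\s+(\S+)(\s+\[Running ZC\])?", line)
        if m:
            entries.append((m.group(1), m.group(2), m.group(3) is not None))
    return entries

# Sample taken from the listing in this comment (abbreviated)
sample = """\
Name: ens1f2 Driver: igb [Running ZC]
Name: enp2s0f0 Driver: tg3
Name: ens1f0 Driver: igb [Running ZC]
"""

zc_ifaces = [name for name, drv, zc in parse_pf_ringcfg(sample) if zc]
print(zc_ifaces)  # -> ['ens1f2', 'ens1f0']
```

Here ens1f0, the interface used in both Suricata runs, shows up as running the ZC-enabled igb driver, while the tg3 interfaces do not.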

cardigliano commented 3 years ago

This does not look directly related to ZC; it looks like a Suricata issue.