StamusNetworks / SELKS

A Suricata based IDS/IPS/NSM distro
https://www.stamus-networks.com/open-source/#selks
GNU General Public License v3.0

Cannot access Moloch #313

Open RedSnow98 opened 3 years ago

RedSnow98 commented 3 years ago

Hi guys,

Maybe this is a silly question but I'm really stuck. I cannot access Moloch: when I try "http://127.0.0.1:8005/" I get an nginx authentication prompt, and the wiki does not mention any credentials for Moloch.

Can someone help me please? Thanks. (screenshot attached)

pevma commented 3 years ago

No problem. Thanks for trying SELKS out.
You should authenticate against https://your.selks.IP.here/ as explained in the first-time setup guide here -
https://github.com/StamusNetworks/SELKS/wiki/First-time-setup
Was that successful? (after which it should not ask you for user/pass)
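For reference, a quick way to confirm that the portal itself answers before logging in (a minimal check, assuming the default self-signed certificate, hence -k):

# Ask nginx for the response headers only over HTTPS; -k skips certificate
# verification because the SELKS portal typically uses a self-signed certificate.
curl -kI https://127.0.0.1/

You should get an HTTP status line rather than a connection error; replace 127.0.0.1 with your SELKS IP when checking from another machine.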

RedSnow98 commented 3 years ago


Thank you for your quick reply. When I am logged in to Scirius, I have access to Scirius and EveBox but no access to Moloch, CyberChef or Kibana...

pevma commented 3 years ago

Can you try https://127.0.0.1/moloch ?
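If that path does not load either, a rough way to check that nginx actually proxies /moloch and that the viewer is listening locally (a sketch, assuming the stock configuration layout under /etc/nginx and the 8005 viewer port mentioned above):

# Look for the moloch proxy/location block in the nginx configuration
# (the /etc/nginx path is the usual default; adjust if yours differs).
grep -Ri moloch /etc/nginx/

# Check whether anything is listening on the local viewer port 8005.
ss -tlnp | grep 8005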

RedSnow98 commented 3 years ago


When I try https://127.0.0.1/moloch, this is what I see: (screenshot attached)

pevma commented 3 years ago

It seems you did not authenticate / log in to Scirius?
Can you also please share the output of selks-health-check_stamus?
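For reference, one way to run it and keep the full output in a file that can be attached here (a minimal sketch; the output file name is arbitrary):

# Run the SELKS health check and save a copy of everything it prints.
sudo selks-health-check_stamus 2>&1 | tee /tmp/selks-health-check.txt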

RedSnow98 commented 3 years ago


Yes, it is exactly like that, but I am authenticated. Yes I can, here is the output:

● suricata.service - LSB: Next Generation IDS/IPS
   Loaded: loaded (/etc/init.d/suricata; generated)
   Active: active (running) since Mon 2021-05-17 08:34:31 CEST; 1h 7min ago
     Docs: man:systemd-sysv-generator(8)
  Process: 807 ExecStart=/etc/init.d/suricata start (code=exited, status=0/SUCCESS)
    Tasks: 8 (limit: 4661)
   Memory: 185.9M
   CGroup: /system.slice/suricata.service
           └─877 /usr/bin/suricata -c /etc/suricata/suricata.yaml --pidfile /var/run/suricata.pid --af-packet -D -v --user=logstash

mai 17 08:34:31 SELKS systemd[1]: Starting LSB: Next Generation IDS/IPS...
mai 17 08:34:31 SELKS suricata[807]: Starting suricata in IDS (af-packet) mode... done.
mai 17 08:34:31 SELKS systemd[1]: Started LSB: Next Generation IDS/IPS.
● elasticsearch.service - Elasticsearch
   Loaded: loaded (/lib/systemd/system/elasticsearch.service; enabled; vendor preset: enabled)
   Active: active (running) since Mon 2021-05-17 08:35:16 CEST; 1h 6min ago
     Docs: https://www.elastic.co
 Main PID: 806 (java)
    Tasks: 114 (limit: 4661)
   Memory: 1.9G
   CGroup: /system.slice/elasticsearch.service
           ├─ 806 /usr/share/elasticsearch/jdk/bin/java -Xshare:auto -Des.networkaddress.cache.ttl=60 -Des.networkaddress.cache.negative.ttl=10 -XX:+AlwaysPreTouch -Xss1m -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djna.nosys=true -XX:-OmitStackTraceInFastThrow -XX:+ShowCodeDetailsInExceptionMessages -Dio.netty.noUnsafe=true -Dio.netty.noKeySetOptimization=true -Dio.netty.recycler.maxCapacityPerThread=0 -Dio.netty.allocator.numDirectArenas=0 -Dlog4j.shutdownHookEnabled=false -Dlog4j2.disable.jmx=true -Djava.locale.providers=SPI,COMPAT --add-opens=java.base/java.io=ALL-UNNAMED -XX:+UseG1GC -Djava.io.tmpdir=/tmp/elasticsearch-4207572800159599454 -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/lib/elasticsearch -XX:ErrorFile=/var/log/elasticsearch/hs_err_pid%p.log -Xlog:gc*,gc+age=trace,safepoint:file=/var/log/elasticsearch/gc.log:utctime,pid,tags:filecount=32,filesize=64m -Xms1961m -Xmx1961m -XX:MaxDirectMemorySize=1028653056 -XX:G1HeapRegionSize=4m -XX:InitiatingHeapOccupancyPercent=30 -XX:G1ReservePercent=15 -Des.path.home=/usr/share/elasticsearch -Des.path.conf=/etc/elasticsearch -Des.distribution.flavor=default -Des.distribution.type=deb -Des.bundled_jdk=true -cp /usr/share/elasticsearch/lib/* org.elasticsearch.bootstrap.Elasticsearch -p /var/run/elasticsearch/elasticsearch.pid --quiet
           └─1341 /usr/share/elasticsearch/modules/x-pack-ml/platform/linux-x86_64/bin/controller

mai 17 08:34:31 SELKS systemd[1]: Starting Elasticsearch...
mai 17 08:35:16 SELKS systemd[1]: Started Elasticsearch.
● logstash.service - logstash
   Loaded: loaded (/etc/systemd/system/logstash.service; enabled; vendor preset: enabled)
   Active: active (running) since Mon 2021-05-17 08:34:30 CEST; 1h 7min ago
 Main PID: 517 (java)
    Tasks: 36 (limit: 4661)
   Memory: 435.2M
   CGroup: /system.slice/logstash.service
           └─517 /usr/share/logstash/jdk/bin/java -Xms1g -Xmx1g -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.compile.invokedynamic=true -Djruby.jit.threshold=0 -Djruby.regexp.interruptible=true -XX:+HeapDumpOnOutOfMemoryError -Djava.security.egd=file:/dev/urandom -Dlog4j2.isThreadContextMapInheritable=true -cp /usr/share/logstash/logstash-core/lib/jars/animal-sniffer-annotations-1.14.jar:/usr/share/logstash/logstash-core/lib/jars/checker-compat-qual-2.0.0.jar:/usr/share/logstash/logstash-core/lib/jars/commons-codec-1.14.jar:/usr/share/logstash/logstash-core/lib/jars/commons-compiler-3.1.0.jar:/usr/share/logstash/logstash-core/lib/jars/commons-logging-1.2.jar:/usr/share/logstash/logstash-core/lib/jars/error_prone_annotations-2.1.3.jar:/usr/share/logstash/logstash-core/lib/jars/google-java-format-1.1.jar:/usr/share/logstash/logstash-core/lib/jars/gradle-license-report-0.7.1.jar:/usr/share/logstash/logstash-core/lib/jars/guava-24.1.1-jre.jar:/usr/share/logstash/logstash-core/lib/jars/j2objc-annotations-1.1.jar:/usr/share/logstash/logstash-core/lib/jars/jackson-annotations-2.9.10.jar:/usr/share/logstash/logstash-core/lib/jars/jackson-core-2.9.10.jar:/usr/share/logstash/logstash-core/lib/jars/jackson-databind-2.9.10.8.jar:/usr/share/logstash/logstash-core/lib/jars/jackson-dataformat-cbor-2.9.10.jar:/usr/share/logstash/logstash-core/lib/jars/janino-3.1.0.jar:/usr/share/logstash/logstash-core/lib/jars/javassist-3.26.0-GA.jar:/usr/share/logstash/logstash-core/lib/jars/jruby-complete-9.2.13.0.jar:/usr/share/logstash/logstash-core/lib/jars/jsr305-1.3.9.jar:/usr/share/logstash/logstash-core/lib/jars/log4j-api-2.13.3.jar:/usr/share/logstash/logstash-core/lib/jars/log4j-core-2.13.3.jar:/usr/share/logstash/logstash-core/lib/jars/log4j-jcl-2.13.3.jar:/usr/share/logstash/logstash-core/lib/jars/log4j-slf4j-impl-2.13.3.jar:/usr/share/logstash/logstash-core/lib/jars/logstash-core.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.core.commands-3.6.0.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.core.contenttype-3.4.100.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.core.expressions-3.4.300.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.core.filesystem-1.3.100.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.core.jobs-3.5.100.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.core.resources-3.7.100.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.core.runtime-3.7.0.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.equinox.app-1.3.100.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.equinox.common-3.6.0.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.equinox.preferences-3.4.1.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.equinox.registry-3.5.101.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.jdt.core-3.10.0.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.osgi-3.7.1.jar:/usr/share/logstash/logstash-core/lib/jars/org.eclipse.text-3.5.101.jar:/usr/share/logstash/logstash-core/lib/jars/reflections-0.9.11.jar:/usr/share/logstash/logstash-core/lib/jars/slf4j-api-1.7.25.jar org.logstash.Logstash --path.settings /etc/logstash

mai 17 08:36:33 SELKS logstash[517]: [2021-05-17T08:36:33,518][INFO ][logstash.outputs.elasticsearch][main] Installing elasticsearch template to _template/logstash
mai 17 08:36:33 SELKS logstash[517]: [2021-05-17T08:36:33,857][INFO ][logstash.filters.geoip   ][main] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.5-java/vendor/GeoLite2-City.mmdb"}
mai 17 08:36:34 SELKS logstash[517]: [2021-05-17T08:36:34,123][INFO ][logstash.filters.geoip   ][main] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-6.0.5-java/vendor/GeoLite2-City.mmdb"}
mai 17 08:36:34 SELKS logstash[517]: [2021-05-17T08:36:34,226][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/etc/logstash/conf.d/logstash.conf", "/etc/logstash/conf.d/scirius-logstash.conf"], :thread=>"#<Thread:0x1f30b118 run>"}
mai 17 08:36:36 SELKS logstash[517]: [2021-05-17T08:36:36,177][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>1.95}
mai 17 08:36:36 SELKS logstash[517]: [2021-05-17T08:36:36,507][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
mai 17 08:36:36 SELKS logstash[517]: [2021-05-17T08:36:36,573][INFO ][filewatch.observingtail  ][main][d4aef1d642dafd3cc0ec28e9e79530daa4bc5c58ba6b725806ceff6c4cfb1cf0] START, creating Discoverer, Watch with file and sincedb collections
mai 17 08:36:36 SELKS logstash[517]: [2021-05-17T08:36:36,608][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
mai 17 08:36:36 SELKS logstash[517]: [2021-05-17T08:36:36,938][ERROR][logstash.codecs.json     ][main][d4aef1d642dafd3cc0ec28e9e79530daa4bc5c58ba6b725806ceff6c4cfb1cf0] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unrecognized token 'mbly_segment_before_base_seq_delta': was expecting ('true', 'false' or 'null')
mai 17 08:36:36 SELKS logstash[517]:  at [Source: (String)"mbly_segment_before_base_seq_delta":0,"reassembly_no_segment":0,"reassembly_no_segment_delta":0,"reassembly_seq_gap":0,"reassembly_seq_gap_delta":0,"reassembly_overlap_different_data":0,"reassembly_overlap_different_data_delta":0},"flow_bypassed":{"local_pkts":0,"local_pkts_delta":0,"local_bytes":0,"local_bytes_delta":0,"local_capture_pkts":0,"local_capture_pkts_delta":0,"local_capture_bytes":0,"local_capture_bytes_delta":0,"closed":0,"closed_delta":0,"pkts":0,"pkts_delta":0,"bytes":0,"bytes_del"[truncated 2973 chars]; line: 1, column: 35]>, :data=>"mbly_segment_before_base_seq_delta\":0,\"reassembly_no_segment\":0,\"reassembly_no_segment_delta\":0,\"reassembly_seq_gap\":0,\"reassembly_seq_gap_delta\":0,\"reassembly_overlap_different_data\":0,\"reassembly_overlap_different_data_delta\":0},\"flow_bypassed\":{\"local_pkts\":0,\"local_pkts_delta\":0,\"local_bytes\":0,\"local_bytes_delta\":0,\"local_capture_pkts\":0,\"local_capture_pkts_delta\":0,\"local_capture_bytes\":0,\"local_capture_bytes_delta\":0,\"closed\":0,\"closed_delta\":0,\"pkts\":0,\"pkts_delta\":0,\"bytes\":0,\"bytes_delta\":0},\"tcp\":{\"sessions\":0,\"sessions_delta\":0,\"ssn_memcap_drop\":0,\"ssn_memcap_drop_delta\":0,\"pseudo\":0,\"pseudo_delta\":0,\"pseudo_failed\":0,\"pseudo_failed_delta\":0,\"invalid_checksum\":0,\"invalid_checksum_delta\":0,\"no_flow\":0,\"no_flow_delta\":0,\"syn\":0,\"syn_delta\":0,\"synack\":0,\"synack_delta\":0,\"rst\":0,\"rst_delta\":0,\"midstream_pickups\":0,\"midstream_pickups_delta\":0,\"pkt_on_wrong_thread\":0,\"pkt_on_wrong_thread_delta\":0,\"segment_memcap_drop\":0,\"segment_memcap_drop_delta\":0,\"stream_depth_reached\":0,\"stream_depth_reached_delta\":0,\"reassembly_gap\":0,\"reassembly_gap_delta\":0,\"overlap\":0,\"overlap_delta\":0,\"overlap_diff_data\":0,\"overlap_diff_data_delta\":0,\"insert_data_normal_fail\":0,\"insert_data_normal_fail_delta\":0,\"insert_data_overlap_fail\":0,\"insert_data_overlap_fail_delta\":0,\"insert_list_fail\":0,\"insert_list_fail_delta\":0,\"memuse\":1146880,\"memuse_delta\":0,\"reassembly_memuse\":196608,\"reassembly_memuse_delta\":0},\"detect\":{\"engines\":[{\"id\":0,\"last_reload\":\"2021-05-17T08:36:04.667719+0200\",\"rules_loaded\":22086,\"rules_failed\":1}],\"alert\":0,\"alert_delta\":0},\"app_layer\":{\"flow\":{\"http\":0,\"http_delta\":0,\"ftp\":0,\"ftp_delta\":0,\"smtp\":0,\"smtp_delta\":0,\"tls\":0,\"tls_delta\":0,\"ssh\":0,\"ssh_delta\":0,\"imap\":0,\"imap_delta\":0,\"smb\":0,\"smb_delta\":0,\"dcerpc_tcp\":0,\"dcerpc_tcp_delta\":0,\"dns_tcp\":0,\"dns_tcp_delta\":0,\"modbus\":0,\"modbus_delta\":0,\"enip_tcp\":0,\"enip_tcp_delta\":0,\"dnp3\":0,\"dnp3_delta\":0,\"nfs_tcp\":0,\"nfs_tcp_delta\":0,\"ntp\":0,\"ntp_delta\":0,\"ftp-data\":0,\"ftp-data_delta\":0,\"tftp\":0,\"tftp_delta\":0,\"ike\":0,\"ike_delta\":0,\"krb5_tcp\":0,\"krb5_tcp_delta\":0,\"dhcp\":0,\"dhcp_delta\":0,\"snmp\":0,\"snmp_delta\":0,\"sip\":0,\"sip_delta\":0,\"rfb\":0,\"rfb_delta\":0,\"mqtt\":0,\"mqtt_delta\":0,\"rdp\":0,\"rdp_delta\":0,\"http2\":0,\"http2_delta\":0,\"failed_tcp\":0,\"failed_tcp_delta\":0,\"dcerpc_udp\":0,\"dcerpc_udp_delta\":0,\"dns_udp\":2,\"dns_udp_delta\":2,\"enip_udp\":0,\"enip_udp_delta\":0,\"nfs_udp\":0,\"nfs_udp_delta\":0,\"krb5_udp\":0,\"krb5_udp_delta\":0,\"failed_udp\":3,\"failed_udp_delta\":3},\"tx\":{\"http\":0,\"http_delta\":0,\"ftp\":0,\"ftp_delta\":0,\"smtp\":0,\"smtp_delta\":0,\"tls\":0,\"tls_delta\":0,\"ssh\":0,\"ssh_delta\":0,\"imap\":0,\"imap_delta\":0,\"smb\":0,\"smb_delta\":0,\"dcerpc_tcp\":0,
\"dcerpc_tcp_delta\":0,\"dns_tcp\":0,\"dns_tcp_delta\":0,\"modbus\":0,\"modbus_delta\":0,\"enip_tcp\":0,\"enip_tcp_delta\":0,\"dnp3\":0,\"dnp3_delta\":0,\"nfs_tcp\":0,\"nfs_tcp_delta\":0,\"ntp\":0,\"ntp_delta\":0,\"ftp-data\":0,\"ftp-data_delta\":0,\"tftp\":0,\"tftp_delta\":0,\"ike\":0,\"ike_delta\":0,\"krb5_tcp\":0,\"krb5_tcp_delta\":0,\"dhcp\":0,\"dhcp_delta\":0,\"snmp\":0,\"snmp_delta\":0,\"sip\":0,\"sip_delta\":0,\"rfb\":0,\"rfb_delta\":0,\"mqtt\":0,\"mqtt_delta\":0,\"rdp\":0,\"rdp_delta\":0,\"http2\":0,\"http2_delta\":0,\"dcerpc_udp\":0,\"dcerpc_udp_delta\":0,\"dns_udp\":7,\"dns_udp_delta\":7,\"enip_udp\":0,\"enip_udp_delta\":0,\"nfs_udp\":0,\"nfs_udp_delta\":0,\"krb5_udp\":0,\"krb5_udp_delta\":0},\"expectations\":0,\"expectations_delta\":0},\"http\":{\"memuse\":0,\"memuse_delta\":0,\"memcap\":0,\"memcap_delta\":0},\"ftp\":{\"memuse\":0,\"memuse_delta\":0,\"memcap\":0,\"memcap_delta\":0},\"file_store\":{\"open_files\":0,\"open_files_delta\":0}}}"}
● kibana.service - Kibana
   Loaded: loaded (/etc/systemd/system/kibana.service; enabled; vendor preset: enabled)
   Active: active (running) since Mon 2021-05-17 08:34:31 CEST; 1h 7min ago
     Docs: https://www.elastic.co
 Main PID: 814 (node)
    Tasks: 11 (limit: 4661)
   Memory: 200.1M
   CGroup: /system.slice/kibana.service
           └─814 /usr/share/kibana/bin/../node/bin/node /usr/share/kibana/bin/../src/cli/dist --logging.dest=/var/log/kibana/kibana.log --pid.file=/run/kibana/kibana.pid

mai 17 08:34:31 SELKS systemd[1]: Started Kibana.
● evebox.service - EveBox Server
   Loaded: loaded (/lib/systemd/system/evebox.service; enabled; vendor preset: enabled)
   Active: active (running) since Mon 2021-05-17 08:34:30 CEST; 1h 7min ago
 Main PID: 518 (evebox)
    Tasks: 3 (limit: 4661)
   Memory: 908.0K
   CGroup: /system.slice/evebox.service
           └─518 /usr/bin/evebox server

mai 17 08:34:30 SELKS systemd[1]: Started EveBox Server.
mai 17 08:34:30 SELKS evebox[518]: 2021-05-17 08:34:30  INFO evebox::version: This is EveBox version 0.13.1 (rev: 0dbcb12); x86_64-unknown-linux-musl
mai 17 08:34:30 SELKS evebox[518]: 2021-05-17 08:34:30  INFO evebox::server::main: Using temporary in-memory configuration database
mai 17 08:34:30 SELKS evebox[518]: 2021-05-17 08:34:30  INFO evebox::sqlite::init: Found event database schema version -1
mai 17 08:34:30 SELKS evebox[518]: 2021-05-17 08:34:30  INFO evebox::sqlite::init: Initializing SQLite database (configdb)
mai 17 08:34:30 SELKS evebox[518]: 2021-05-17 08:34:30  INFO evebox::sqlite::init: Updating SQLite database to schema version 1 (configdb)
mai 17 08:34:30 SELKS evebox[518]: 2021-05-17 08:34:30 ERROR evebox::server::main: Failed to get Elasticsearch version, things may not work right: error=request: error sending request for url (http://localhost:9200/): error trying to connect: tcp connect error: Connection refused (os error 111)
mai 17 08:34:30 SELKS evebox[518]: 2021-05-17 08:34:30  INFO evebox::server::main: Starting server on 127.0.0.1:5636, tls=false
● molochviewer-selks.service - Moloch Viewer
   Loaded: loaded (/etc/systemd/system/molochviewer-selks.service; enabled; vendor preset: enabled)
   Active: active (running) since Mon 2021-05-17 08:36:08 CEST; 1h 5min ago
 Main PID: 1764 (sh)
    Tasks: 12 (limit: 4661)
   Memory: 32.3M
   CGroup: /system.slice/molochviewer-selks.service
           ├─1764 /bin/sh -c /data/moloch/bin/node viewer.js -c /data/moloch/etc/config.ini >> /data/moloch/logs/viewer.log 2>&1
           └─1765 /data/moloch/bin/node viewer.js -c /data/moloch/etc/config.ini

mai 17 08:36:08 SELKS systemd[1]: molochviewer-selks.service: Service RestartSec=1min 30s expired, scheduling restart.
mai 17 08:36:08 SELKS systemd[1]: molochviewer-selks.service: Scheduled restart job, restart counter is at 1.
mai 17 08:36:08 SELKS systemd[1]: Stopped Moloch Viewer.
mai 17 08:36:08 SELKS systemd[1]: Started Moloch Viewer.
● molochpcapread-selks.service - Moloch Pcap Read
   Loaded: loaded (/etc/systemd/system/molochpcapread-selks.service; enabled; vendor preset: enabled)
   Active: active (running) since Mon 2021-05-17 08:36:02 CEST; 1h 5min ago
 Main PID: 1719 (sh)
    Tasks: 6 (limit: 4661)
   Memory: 29.3M
   CGroup: /system.slice/molochpcapread-selks.service
           ├─1719 /bin/sh -c /data/moloch/bin/moloch-capture -c /data/moloch/etc/config.ini -m --copy --delete -R /data/nsm/  >> /data/moloch/logs/capture.log 2>&1
           └─1720 /data/moloch/bin/moloch-capture -c /data/moloch/etc/config.ini -m --copy --delete -R /data/nsm/

mai 17 08:36:02 SELKS systemd[1]: Started Moloch Pcap Read.
scirius                          RUNNING   pid 925, uptime 1:07:05
ii  elasticsearch                         7.12.1                                  amd64        Distributed RESTful search engine built for the cloud
ii  elasticsearch-curator                 5.8.4                                   amd64        Have indices in Elasticsearch? This is the tool for you!\n\nLike a museum curator manages the exhibits and collections on display, \nElasticsearch Curator helps you curate, or manage your indices.
ii  evebox                                1:0.13.1                                amd64        no description given
ii  kibana                                7.12.1                                  amd64        Explore and visualize your Elasticsearch data
ii  kibana-dashboards-stamus              2020122001                              amd64        Kibana 6 dashboard templates.
ii  logstash                              1:7.12.1-1                              amd64        An extensible logging pipeline
ii  moloch                                2.7.1-1                                 amd64        Moloch Full Packet System
ii  scirius                               3.7.0-1                                 amd64        Django application to manage Suricata ruleset
ii  suricata                              1:2021050601-0stamus0                   amd64        Suricata open source multi-thread IDS/IPS/NSM system.
Filesystem       Type       Size   Used  Avail Use% Mounted on
udev             devtmpfs   1,9G       0  1,9G   0% /dev
tmpfs            tmpfs      393M    6,3M  387M   2% /run
/dev/sda1        ext4        59G     15G   42G  26% /
tmpfs            tmpfs      2,0G     39M  1,9G   2% /dev/shm
tmpfs            tmpfs      5,0M    4,0K  5,0M   1% /run/lock
tmpfs            tmpfs      2,0G       0  2,0G   0% /sys/fs/cgroup
tmpfs            tmpfs      393M     12K  393M   1% /run/user/1000
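For what it's worth, the Moloch viewer and pcap-read services show as running above; their own logs can also be checked, using the log paths visible in the unit command lines (a sketch):

# Last lines of the Moloch viewer and capture logs
# (paths taken from the service definitions shown above).
tail -n 50 /data/moloch/logs/viewer.log
tail -n 50 /data/moloch/logs/capture.log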
pevma commented 3 years ago

What happens when you go to https://127.0.0.1/ ?

RedSnow98 commented 3 years ago


It shows me something like this, so I think I am trying to access Moloch the wrong way... Any solution? (screenshot attached)

pevma commented 3 years ago

Can you try accessing it from a different browser? Or clear the browser cache.
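To rule out browser caching entirely, the same request can also be made from the command line (a sketch; -k skips verification of the self-signed certificate):

# Fetch only the response headers for the Moloch page through nginx,
# bypassing any browser cache or stale cookies.
curl -kI https://127.0.0.1/moloch

The raw HTTP status returned here should show whether the problem is on the browser side or on the server side.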