TheHive-Project / TheHive

TheHive: a Scalable, Open Source and Free Security Incident Response Platform
https://thehive-project.org
GNU Affero General Public License v3.0

Remoting error unable to start the hive 3.4.0-0.1RC2 #1092

Closed VinzS0C closed 5 years ago

VinzS0C commented 5 years ago

Unable to start thehive 3.4.0-0.1RC2

This is a fresh install; I didn't have any issue with 3.3.1 and Elasticsearch 5.

Request Type

Bug

Work Environment

| Question | Answer |
|---|---|
| OS version (server) | CentOS 7 - 3.10.0-957.el7.x86_64 |
| OS version (client) | Win 10 |
| TheHive version / git hash | 3.4.0-0.1RC2 |
| Package Type | RPM |
| Elasticsearch version | 6.8.2 |

Problem Description

I am unable to start my fresh install with 3.4.0-0.1RC2 and elasticsearch-6.8.2

Steps to Reproduce

  1. `sudo systemctl status elasticsearch.service`: OK
  2. `sudo systemctl status thehive.service`: not OK

Any idea?
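For completeness, a couple of standard systemd/journald commands to pull more detail on why the unit fails (the log path is the default one used later in this report):

```
# Show the most recent journal entries for the failing unit
sudo journalctl -u thehive.service -n 50 --no-pager

# Tail TheHive's own application log for the startup error
sudo tail -n 100 /var/log/thehive/application.log
```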

Complementary information

From `/var/log/thehive/application.log`:

```
2019-08-06 11:40:57,517 [INFO] from module in main - Loading model class connectors.cortex.models.ActionModel
2019-08-06 11:40:57,518 [INFO] from module in main - Loading model class models.AuditModel
2019-08-06 11:40:59,961 [INFO] from akka.event.slf4j.Slf4jLogger in application-akka.actor.default-dispatcher-4 - Slf4jLogger started
2019-08-06 11:40:59,990 [INFO] from akka.remote.Remoting in application-akka.actor.default-dispatcher-4 - Starting remoting
2019-08-06 11:41:10,011 [ERROR] from akka.remote.Remoting in application-akka.actor.default-dispatcher-5 - Remoting error: [Startup timed out. This is usually related to actor system host setting or host name resolution misconfiguration.] [
akka.remote.RemoteTransportException: Startup timed out. This is usually related to actor system host setting or host name resolution misconfiguration.
        at akka.remote.Remoting.akka$remote$Remoting$$notifyError(Remoting.scala:148)
        at akka.remote.Remoting.start(Remoting.scala:211)
        at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:222)
        at akka.cluster.ClusterActorRefProvider.init(ClusterActorRefProvider.scala:32)
        at akka.actor.ActorSystemImpl.liftedTree2$1(ActorSystem.scala:874)
        at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:870)
        at akka.actor.ActorSystemImpl._start(ActorSystem.scala:870)
        at akka.actor.ActorSystemImpl.start(ActorSystem.scala:891)
        at akka.actor.ActorSystem$.apply(ActorSystem.scala:246)
        at play.api.libs.concurrent.ActorSystemProvider$.start(Akka.scala:205)
        at play.api.libs.concurrent.ActorSystemProvider$.start(Akka.scala:153)
        at play.api.libs.concurrent.ActorSystemProvider.get$lzycompute(Akka.scala:116)
        at play.api.libs.concurrent.ActorSystemProvider.get(Akka.scala:115)
        at play.api.libs.concurrent.ActorSystemProvider.get(Akka.scala:107)
        at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:85)
        at com.google.inject.internal.BoundProviderFactory.provision(BoundProviderFactory.java:77)
        at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:59)
        at com.google.inject.internal.BoundProviderFactory.get(BoundProviderFactory.java:61)
        at com.google.inject.internal.SingleFieldInjector.inject(SingleFieldInjector.java:52)
        at com.google.inject.internal.MembersInjectorImpl.injectMembers(MembersInjectorImpl.java:147)
        at com.google.inject.internal.MembersInjectorImpl.injectAndNotify(MembersInjectorImpl.java:101)
        at com.google.inject.internal.MembersInjectorImpl.injectMembers(MembersInjectorImpl.java:71)
        at com.google.inject.internal.InjectorImpl.injectMembers(InjectorImpl.java:1014)
        at com.google.inject.util.Providers$GuicifiedProviderWithDependencies.initialize(Providers.java:154)
        at com.google.inject.util.Providers$GuicifiedProviderWithDependencies$$FastClassByGuice$$2a7177aa.invoke()
        at com.google.inject.internal.SingleMethodInjector$1.invoke(SingleMethodInjector.java:51)
        at com.google.inject.internal.SingleMethodInjector.inject(SingleMethodInjector.java:85)
        at com.google.inject.internal.MembersInjectorImpl.injectMembers(MembersInjectorImpl.java:147)
        at com.google.inject.internal.MembersInjectorImpl.injectAndNotify(MembersInjectorImpl.java:101)
        at com.google.inject.internal.Initializer$InjectableReference.get(Initializer.java:245)
        at com.google.inject.internal.Initializer.injectAll(Initializer.java:140)
        at com.google.inject.internal.InternalInjectorCreator.injectDynamically(InternalInjectorCreator.java:176)
        at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:109)
        at com.google.inject.Guice.createInjector(Guice.java:87)
        at com.google.inject.Guice.createInjector(Guice.java:78)
        at play.api.inject.guice.GuiceBuilder.injector(GuiceInjectorBuilder.scala:200)
        at play.api.inject.guice.GuiceApplicationBuilder.build(GuiceApplicationBuilder.scala:154)
        at play.api.inject.guice.GuiceApplicationLoader.load(GuiceApplicationLoader.scala:25)
        at play.core.server.ProdServerStart$.start(ProdServerStart.scala:53)
        at play.core.server.ProdServerStart$.main(ProdServerStart.scala:27)
        at play.core.server.ProdServerStart.main(ProdServerStart.scala)
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
```
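The hint in the error ("actor system host setting or host name resolution misconfiguration") usually means the machine cannot resolve its own hostname. A minimal sanity check, assuming a single-node setup (the hostname and IP in the commented /etc/hosts example are placeholders, not values from this report):

```
# What does the machine think its name is?
hostname
hostname -f

# Can that name be resolved locally?
getent hosts "$(hostname)"

# If resolution fails, an /etc/hosts entry along these lines is a common fix
# (replace thehive-host and the IP with the real values):
# 127.0.0.1   localhost
# 10.0.0.5    thehive-host.example.local thehive-host
```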

From `/etc/thehive/application.conf`:

```
# Secret Key
play.http.secret.key="XXX"

search {
  index = the_hive
  uri = "http://127.0.0.1:9200/"
```

Passimist commented 5 years ago

Is that your entire /etc/thehive/application.conf? Could you post the rest of it? Just XXX out the keys like you already did.

VinzS0C commented 5 years ago

> Is that your entire /etc/thehive/application.conf? Could you post the rest of it? Just XXX out the keys like you already did.

Thank you for your message. Here is my full application.conf file; I didn't change anything except the key.

Output of `cat /etc/thehive/application.conf`:

```
# Secret Key
# The secret key is used to secure cryptographic functions.
# WARNING: If you deploy your application on several servers, make sure to use the same key.
play.http.secret.key="XXX"

# Elasticsearch
search {
  ## Basic configuration
  # Index name.
  index = the_hive

  # ElasticSearch instance address.
  uri = "http://127.0.0.1:9200/"

  ## Advanced configuration
  # Scroll keepalive.
  #keepalive = 1m
  # Scroll page size.
  #pagesize = 50
  # Number of shards
  #nbshards = 5
  # Number of replicas
  #nbreplicas = 1
  # Arbitrary settings
  #settings {
  #  # Maximum number of nested fields
  #  mapping.nested_fields.limit = 100
  #}

  ## Authentication configuration
  #username = "admin"
  #password = "admin"

  ## SSL configuration
  #search.keyStore {
  #  path = "/path/to/keystore"
  #  type = "JKS" # or PKCS12
  #  password = "keystore-password"
  #}
  #search.trustStore {
  #  path = "/path/to/trustStore"
  #  type = "JKS" # or PKCS12
  #  password = "trustStore-password"
  #}
}

# Authentication
auth {
  # "provider" parameter contains authentication provider. It can be multi-valued (useful for migration)
  # available auth types are:
  # services.LocalAuthSrv : passwords are stored in user entity (in Elasticsearch). No configuration is required.
  # ad : use ActiveDirectory to authenticate users. Configuration is under "auth.ad" key
  # ldap : use LDAP to authenticate users. Configuration is under "auth.ldap" key
  provider = [local]

  # By default, basic authentication is disabled. You can enable it by setting "method.basic" to true.
  method.basic = true

  ad {
    # The Windows domain name in DNS format. This parameter is required if you do not use
    # 'serverNames' below.
    #domainFQDN = "mydomain.local"

    # Optionally you can specify the host names of the domain controllers instead of using 'domainFQDN'
    # above. If this parameter is not set, TheHive uses 'domainFQDN'.
    #serverNames = [ad1.mydomain.local, ad2.mydomain.local]

    # The Windows domain name using short format. This parameter is required.
    #domainName = "MYDOMAIN"

    # If 'true', use SSL to connect to the domain controller.
    #useSSL = true
  }

  ldap {
    # The LDAP server name or address. The port can be specified using the 'host:port'
    # syntax. This parameter is required if you don't use 'serverNames' below.
    #serverName = "ldap.mydomain.local:389"

    # If you have multiple LDAP servers, use the multi-valued setting 'serverNames' instead.
    #serverNames = [ldap1.mydomain.local, ldap2.mydomain.local]

    # Account to use to bind to the LDAP server. This parameter is required.
    #bindDN = "cn=thehive,ou=services,dc=mydomain,dc=local"

    # Password of the binding account. This parameter is required.
    #bindPW = "***secret*password***"

    # Base DN to search users. This parameter is required.
    #baseDN = "ou=users,dc=mydomain,dc=local"

    # Filter to search user in the directory server. Please note that {0} is replaced
    # by the actual user name. This parameter is required.
    #filter = "(cn={0})"

    # If 'true', use SSL to connect to the LDAP directory server.
    #useSSL = true
  }
}

# Maximum time between two requests without requesting authentication
session {
  warning = 5m
  inactivity = 1h
}

# Max textual content length
play.http.parser.maxMemoryBuffer = 1M
# Max file size
play.http.parser.maxDiskBuffer = 1G

# Cortex
# TheHive can connect to one or multiple Cortex instances. Give each
# Cortex instance a name and specify the associated URL.
#
# In order to use Cortex, first you need to enable the Cortex module by uncommenting the next line
#play.modules.enabled += connectors.cortex.CortexConnector

cortex {
  #"CORTEX-SERVER-ID" {
  #  url = ""
  #  key = ""
  #  # HTTP client configuration (SSL and proxy)
  #  ws {}
  #}
}

# MISP
# TheHive can connect to one or multiple MISP instances. Give each MISP
# instance a name and specify the associated Authkey that must be used
# to poll events, the case template that should be used by default when
# importing events as well as the tags that must be added to cases upon
# import.

# Prior to configuring the integration with a MISP instance, you must
# enable the MISP connector. This will allow you to import events to
# and/or export cases to the MISP instance(s).
#play.modules.enabled += connectors.misp.MispConnector

misp {
  # Interval between consecutive MISP event imports in hours (h) or
  # minutes (m).
  interval = 1h

  #"MISP-SERVER-ID" {
  #  # MISP connection configuration requires at least an url and a key. The key must
  #  # be linked with a sync account on MISP.
  #  url = ""
  #  key = ""
  #
  #  # Name of the case template in TheHive that shall be used to import
  #  # MISP events as cases by default.
  #  caseTemplate = ""
  #
  #  # Optional tags to add to each observable imported from an event
  #  # available on this instance.
  #  tags = ["misp-server-id"]
  #
  #  ## MISP event filters
  #  # MISP filters is used to exclude events from the import.
  #  # Filter criteria are:
  #  # The number of attribute
  #  max-attributes = 1000
  #  # The size of its JSON representation
  #  max-size = 1 MiB
  #  # The age of the last publish date
  #  max-age = 7 days
  #  # Organization and tags
  #  exclusion {
  #    organisation = ["bad organisation", "other organisations"]
  #    tags = ["tag1", "tag2"]
  #  }
  #
  #  ## HTTP client configuration (SSL and proxy)
  #  # Truststore to use to validate the X.509 certificate of the MISP
  #  # instance if the default truststore is not sufficient.
  #  # Proxy can also be used
  #  ws {
  #    ssl.trustManager.stores = [ {
  #      path = /path/to/truststore.jks
  #    } ]
  #    proxy {
  #      host = proxy.mydomain.org
  #      port = 3128
  #    }
  #  }
  #
  #  # MISP purpose defines if this instance can be used to import events (ImportOnly), export cases (ExportOnly) or both (ImportAndExport)
  #  # Default is ImportAndExport
  #  purpose = ImportAndExport
  #} ## <-- Uncomment to complete the configuration
}
```

crackytsi commented 5 years ago

Hey, I'm convinced there is a bug in 3.4RC2 that prevents the database initialization, or the upgrade of the database to a new version. If possible, prepare the database using 3.4RC1 and then upgrade the application to 3.4RC2; this works fine ;)
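A rough sketch of that workaround on an RPM-based install (the package version strings below are placeholders taken from the version numbers in this thread; check the TheHive repository for the exact ones):

```
# Install RC1 first so it can create the Elasticsearch index
# (version strings are illustrative, not confirmed package names)
sudo yum install thehive-3.4.0-0.1RC1
sudo systemctl start thehive     # then initialise the database from the web UI

# Once the index exists, move to RC2
sudo yum install thehive-3.4.0-0.1RC2
sudo systemctl restart thehive
```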

VinzS0C commented 5 years ago

> Hey, I'm convinced there is a bug in 3.4RC2 that prevents the database initialization, or the upgrade of the database to a new version. If possible, prepare the database using 3.4RC1 and then upgrade the application to 3.4RC2; this works fine ;)

Hi

I have uninstalled RC2, then installed RC1. Unfortunately I have the same issue; something seems to be wrong with the hostname configuration ... but I can't find where.

```
2019-08-06 16:07:53,589 [INFO] from org.reflections.Reflections in main - Reflections took 186 ms to scan 4 urls, producing 118 keys and 1299 values
2019-08-06 16:07:53,617 [INFO] from module in main - Loading model class models.ArtifactModel
2019-08-06 16:07:53,620 [INFO] from module in main - Loading model class models.UserModel
2019-08-06 16:07:53,620 [INFO] from module in main - Loading model class models.CaseModel
2019-08-06 16:07:53,621 [INFO] from module in main - Loading model class models.AuditModel
2019-08-06 16:07:53,621 [INFO] from module in main - Loading model class connectors.cortex.models.ActionModel
2019-08-06 16:07:53,621 [INFO] from module in main - Loading model class models.AlertModel
2019-08-06 16:07:53,622 [INFO] from module in main - Loading model class models.TaskModel
2019-08-06 16:07:53,622 [INFO] from module in main - Loading model class org.elastic4play.services.DBListModel
2019-08-06 16:07:53,622 [INFO] from module in main - Loading model class models.CaseTemplateModel
2019-08-06 16:07:53,622 [INFO] from module in main - Loading model class models.LogModel
2019-08-06 16:07:53,623 [INFO] from module in main - Loading model class connectors.cortex.models.JobModel
2019-08-06 16:07:53,623 [INFO] from module in main - Loading model class connectors.cortex.models.ReportTemplateModel
2019-08-06 16:07:53,623 [INFO] from module in main - Loading model class models.DashboardModel
2019-08-06 16:07:53,623 [INFO] from module in main - Loading model class org.elastic4play.services.AttachmentModel
2019-08-06 16:07:55,718 [INFO] from akka.event.slf4j.Slf4jLogger in application-akka.actor.default-dispatcher-5 - Slf4jLogger started
2019-08-06 16:07:55,762 [INFO] from akka.remote.Remoting in application-akka.actor.default-dispatcher-5 - Starting remoting
2019-08-06 16:08:05,770 [ERROR] from akka.remote.Remoting in application-akka.actor.default-dispatcher-2 - Remoting error: [Startup timed out. This is usually related to actor system host setting or host name resolution misconfiguration.] [
akka.remote.RemoteTransportException: Startup timed out. This is usually related to actor system host setting or host name resolution misconfiguration.
```

crackytsi commented 5 years ago

It seems that you are missing `search.uri = "http://127.0.0.1:9200"`. Did you read the Migration guide?

VinzS0C commented 5 years ago

> It seems that you are missing `search.uri = "http://127.0.0.1:9200"`. Did you read the Migration guide?

Yes, for sure I read it, but in my case it's not a migration; it's a fresh install on a new server.

With RC2, the application.conf file ships directly with the right settings; with RC1, I have to modify the file, changing host to uri.

In my case I have:

```
# Elasticsearch
search {
  ## Basic configuration
  # Index name.
  index = the_hive
  # ElasticSearch instance address.
  uri = "http://127.0.0.1:9200/"
```
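For context, the host-to-uri change mentioned above is the switch from the pre-3.4 TCP transport settings to the 3.4 HTTP client. A rough before/after sketch; the old values are recalled from the 3.3.x default configuration and should be treated as illustrative, not authoritative:

```
# TheHive <= 3.3.x (TCP transport client) - illustrative only
search {
  index = the_hive
  cluster = hive
  host = ["127.0.0.1:9300"]
}

# TheHive 3.4.x (HTTP client)
search {
  index = the_hive
  uri = "http://127.0.0.1:9200/"
}
```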

VinzS0C commented 5 years ago

The issue is resolved by downgrading Elasticsearch from 6.8.2 to 6.7.2. Configuration is in progress with 3.4.0-0.1RC2 and Elasticsearch 6.7.2.
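For anyone hitting the same thing on a fresh CentOS 7 install, a minimal sketch of pinning the older Elasticsearch version. It assumes the Elastic 6.x yum repository is already configured and that there is no existing index data, since Elasticsearch does not support downgrading a populated data directory:

```
# Stop the services, remove the 6.8.2 package and install 6.7.2 explicitly
sudo systemctl stop thehive elasticsearch
sudo yum remove -y elasticsearch
sudo yum install -y elasticsearch-6.7.2

# Bring the stack back up
sudo systemctl start elasticsearch
sudo systemctl start thehive
```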