I'm trying to run TheHive on Arch Linux by following the step-by-step guide, with the Cassandra DB and the search index running on localhost for testing. I also applied the missing steps I described in https://github.com/TheHive-Project/TheHive/issues/2463, and after all the configuration the cassandra, elasticsearch and thehive services are running (according to sudo systemctl status). Despite this, I'm not able to access TheHive at localhost:9000 and, according to the logs, this happens because Elasticsearch is asking for authentication:
<SNIP>
2023-04-23 02:40:36,597 [INFO] from com.datastax.driver.core.Cluster in application-akka.actor.default-dispatcher-6 [|] New Cassandra host /127.0.0.1:9042 added
2023-04-23 02:40:36,609 [INFO] from org.janusgraph.diskstorage.Backend in application-akka.actor.default-dispatcher-6 [|] Configuring index [search]
2023-04-23 02:40:36,969 [WARN] from org.janusgraph.diskstorage.es.rest.RestElasticSearchClient in application-akka.actor.default-dispatcher-6 [|] Unable to determine Elasticsearch server version. Default to SEVEN.
org.elasticsearch.client.ResponseException: method [GET], host [http://localhost:9200], URI [/], status line [HTTP/1.1 401 Unauthorized]
{"error":{"root_cause":[{"type":"security_exception","reason":"missing authentication credentials for REST request [/]","header":{"WWW-Authenticate":["Basic realm=\"security\" charset=\"UTF-8\"","ApiKey"]}}],"type":"security_exception","reason":"missing authentication credentials for REST request [/]","header":{"WWW-Authenticate":["Basic realm=\"security\" charset=\"UTF-8\"","ApiKey"]}},"status":401}
at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:283)
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:261)
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:235)
at org.janusgraph.diskstorage.es.rest.RestElasticSearchClient.getMajorVersion(RestElasticSearchClient.java:137)
at org.janusgraph.diskstorage.es.rest.RestElasticSearchClient.<init>(RestElasticSearchClient.java:117)
at org.janusgraph.diskstorage.es.rest.RestClientSetup.getElasticSearchClient(RestClientSetup.java:107)
at org.janusgraph.diskstorage.es.rest.RestClientSetup.connect(RestClientSetup.java:75)
at org.janusgraph.diskstorage.es.ElasticSearchSetup$1.connect(ElasticSearchSetup.java:51)
at org.janusgraph.diskstorage.es.ElasticSearchIndex.interfaceConfiguration(ElasticSearchIndex.java:445)
at org.janusgraph.diskstorage.es.ElasticSearchIndex.<init>(ElasticSearchIndex.java:332)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.janusgraph.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:58)
at org.janusgraph.diskstorage.Backend.getImplementationClass(Backend.java:440)
at org.janusgraph.diskstorage.Backend.getIndexes(Backend.java:427)
at org.janusgraph.diskstorage.Backend.<init>(Backend.java:150)
at org.janusgraph.graphdb.configuration.GraphDatabaseConfiguration.getBackend(GraphDatabaseConfiguration.java:1359)
at org.janusgraph.graphdb.database.StandardJanusGraph.<init>(StandardJanusGraph.java:146)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:161)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:132)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:112)
at org.thp.scalligraph.janus.JanusDatabase$.$anonfun$openDatabase$3(JanusDatabase.scala:62)
at org.thp.scalligraph.utils.DelayRetry.sync(Retry.scala:78)
at org.thp.scalligraph.janus.JanusDatabase$.openDatabase(JanusDatabase.scala:62)
at org.thp.scalligraph.janus.JanusDatabaseProvider.$anonfun$get$3(JanusDatabaseProvider.scala:114)
at scala.util.Success.$anonfun$map$1(Try.scala:255)
at scala.util.Success.map(Try.scala:213)
at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
at org.thp.scalligraph.ContextPropagatingDispatcher$$anon$1.$anonfun$execute$2(ContextPropagatingDisptacher.scala:57)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.thp.scalligraph.DiagnosticContext$$anon$2.withContext(ContextPropagatingDisptacher.scala:77)
at org.thp.scalligraph.ContextPropagatingDispatcher$$anon$1.$anonfun$execute$1(ContextPropagatingDisptacher.scala:57)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:49)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
2023-04-23 02:40:36,975 [WARN] from org.thp.scalligraph.utils.Retry in application-akka.actor.default-dispatcher-6 [|] An error occurs (java.lang.IllegalArgumentException: Could not instantiate implementation: org.janusgraph.diskstorage.es.ElasticSearchIndex), retrying (1)
<SNIP>
My /etc/thehive/application.conf is:
include "/etc/thehive/secret.conf"
db {
  provider: janusgraph
  janusgraph {
    storage {
      backend: cql
      hostname: ["127.0.0.1"] # seed node ip addresses
      #username: "<cassandra_username>" # login to connect to database (if configured in Cassandra)
      #password: "<cassandra_password>"
      cql {
        cluster-name: thp # cluster name
        keyspace: thehive # name of the keyspace
        local-datacenter: datacenter1 # name of the datacenter where TheHive runs (relevant only on multi datacenter setup)
        # replication-factor: 2 # number of replica
        read-consistency-level: ONE
        write-consistency-level: ONE
      }
    }
    ## Index configuration
    index.search {
      backend: elasticsearch
      hostname: ["localhost"]
      index-name: thehive
    }
  }
}

## Storage configuration
storage {
  provider = localfs
  localfs.location = /opt/thp/thehive/files
}
I also tried a different index.search configuration as a test,
where hello is a working test user I created, and with it I can connect to the Elasticsearch service at http://127.0.0.1:9200. When connecting to http://127.0.0.1:9000, I still get an "Unable to connect" message in the browser.
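For reference, a minimal sketch of an index.search block that passes basic-auth credentials to Elasticsearch, roughly along these lines (hello is just the test user mentioned above, the password is a placeholder, and the elasticsearch.http.auth option names are taken from the JanusGraph documentation, so they may need double-checking):

db {
  janusgraph {
    ## Index configuration with Elasticsearch basic authentication (sketch)
    index.search {
      backend: elasticsearch
      hostname: ["localhost"]
      index-name: thehive
      elasticsearch {
        http {
          auth {
            type: basic
            basic {
              username: "hello"            # test user created in Elasticsearch
              password: "<hello_password>" # placeholder
            }
          }
        }
      }
    }
  }
}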