I start the service using "bin/thehive -Dconfig.file=/etc/thehive/application.conf".
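For the record, the RPM package also installs a systemd unit (I assume it is named "thehive"; I have not double-checked), so the service can presumably also be started and its logs followed with:

sudo systemctl start thehive
journalctl -u thehive -f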
I then observe the following error in the logs:
21:21:38.879 [main] INFO ScalligraphApplication - Loading application ...
[error] a.a.OneForOneStrategy - Unable to provision, see the following errors:
1) Error injecting constructor, java.lang.IllegalArgumentException: Could not find implementation class: org.janusgraph.diskstorage.inmemory.InMemoryStoreManager
at org.thp.scalligraph.janus.JanusDatabase.<init>(JanusDatabase.scala:76)
at org.thp.scalligraph.janus.JanusDatabase.class(JanusDatabase.scala:61)
while locating org.thp.scalligraph.janus.JanusDatabase
while locating org.thp.scalligraph.models.Database
for the 2nd parameter of org.thp.thehive.models.DatabaseProvider.<init>(SchemaUpdaterActor.scala:19)
at org.thp.thehive.models.DatabaseProvider.class(SchemaUpdaterActor.scala:18)
while locating org.thp.thehive.models.DatabaseProvider
while locating org.thp.scalligraph.models.Database annotated with @com.google.inject.name.Named(value=with-thehive-schema)
for the 4th parameter of org.thp.thehive.services.AuditSrv.<init>(AuditSrv.scala:29)
at org.thp.thehive.services.AuditSrv.class(AuditSrv.scala:28)
while locating org.thp.thehive.services.AuditSrv
for the 3rd parameter of org.thp.thehive.services.notification.NotificationActor.<init>(NotificationActor.scala:78)
while locating org.thp.thehive.services.notification.NotificationActor
1 error
akka.actor.ActorInitializationException: akka://application/user/notification-actor: exception during creation
at akka.actor.ActorInitializationException$.apply(Actor.scala:196)
at akka.actor.ActorCell.create(ActorCell.scala:661)
at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:513)
at akka.actor.ActorCell.systemInvoke(ActorCell.scala:535)
at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:295)
at akka.dispatch.Mailbox.run(Mailbox.scala:230)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
Caused by: com.google.inject.ProvisionException: Unable to provision, see the following errors:
1) Error injecting constructor, java.lang.IllegalArgumentException: Could not find implementation class: org.janusgraph.diskstorage.inmemory.InMemoryStoreManager
at org.thp.scalligraph.janus.JanusDatabase.<init>(JanusDatabase.scala:76)
at org.thp.scalligraph.janus.JanusDatabase.class(JanusDatabase.scala:61)
while locating org.thp.scalligraph.janus.JanusDatabase
while locating org.thp.scalligraph.models.Database
for the 2nd parameter of org.thp.thehive.models.DatabaseProvider.<init>(SchemaUpdaterActor.scala:19)
at org.thp.thehive.models.DatabaseProvider.class(SchemaUpdaterActor.scala:18)
while locating org.thp.thehive.models.DatabaseProvider
while locating org.thp.scalligraph.models.Database annotated with @com.google.inject.name.Named(value=with-thehive-schema)
for the 4th parameter of org.thp.thehive.services.AuditSrv.<init>(AuditSrv.scala:29)
at org.thp.thehive.services.AuditSrv.class(AuditSrv.scala:28)
while locating org.thp.thehive.services.AuditSrv
for the 3rd parameter of org.thp.thehive.services.notification.NotificationActor.<init>(NotificationActor.scala:78)
while locating org.thp.thehive.services.notification.NotificationActor
1 error
at com.google.inject.internal.InternalProvisionException.toProvisionException(InternalProvisionException.java:226)
at com.google.inject.internal.InjectorImpl$1.get(InjectorImpl.java:1097)
at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1131)
at play.api.inject.guice.GuiceInjector.instanceOf(GuiceInjectorBuilder.scala:436)
at play.api.inject.guice.GuiceInjector.instanceOf(GuiceInjectorBuilder.scala:431)
at play.api.inject.ContextClassLoaderInjector.$anonfun$instanceOf$2(Injector.scala:119)
at play.api.inject.ContextClassLoaderInjector.withContext(Injector.scala:128)
at play.api.inject.ContextClassLoaderInjector.instanceOf(Injector.scala:119)
at play.api.libs.concurrent.ActorRefProvider.$anonfun$get$1(Akka.scala:281)
at akka.actor.TypedCreatorFunctionConsumer.produce(IndirectActorProducer.scala:91)
Caused by: java.lang.IllegalArgumentException: Could not find implementation class: org.janusgraph.diskstorage.inmemory.InMemoryStoreManager
at org.janusgraph.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:60)
at org.janusgraph.diskstorage.Backend.getImplementationClass(Backend.java:440)
at org.janusgraph.diskstorage.Backend.getStorageManager(Backend.java:411)
at org.janusgraph.graphdb.configuration.builder.GraphDatabaseConfigurationBuilder.build(GraphDatabaseConfigurationBuilder.java:50)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:161)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:132)
at org.janusgraph.core.JanusGraphFactory.open(JanusGraphFactory.java:112)
at org.thp.scalligraph.janus.JanusDatabase$.openDatabase(JanusDatabase.scala:56)
at org.thp.scalligraph.janus.JanusDatabase.<init>(JanusDatabase.scala:77)
at org.thp.scalligraph.janus.JanusDatabase$$FastClassByGuice$$113881e3.newInstance(<generated>)
Caused by: java.lang.ClassNotFoundException: org.janusgraph.diskstorage.inmemory.InMemoryStoreManager
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.janusgraph.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:56)
at org.janusgraph.diskstorage.Backend.getImplementationClass(Backend.java:440)
at org.janusgraph.diskstorage.Backend.getStorageManager(Backend.java:411)
[...]
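If I read the trace correctly, JanusGraph tries to use an inmemory storage backend, presumably a built-in default since db.janusgraph is commented out in my configuration, and the class org.janusgraph.diskstorage.inmemory.InMemoryStoreManager is simply not shipped in the package. Based on the "For test only" comments in the packaged template, I believe a minimal single-node database block would look like the sketch below (the directory is a placeholder, not a real path on my system):

db.janusgraph {
  // Single-node storage for testing, per the template comments
  storage.backend: berkeleyje
  storage.directory: /path/to/berkeleydb  // placeholder path
  berkeleyje.freeDisk: 200                // disk usage threshold
}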
Complementary information
ElasticSearch is running, and Cassandra is not installed because I want to use ElasticSearch.
The output from the command # java -version:
openjdk version "1.8.0_275"
OpenJDK Runtime Environment (build 1.8.0_275-b01)
OpenJDK 64-Bit Server VM (build 25.275-b01, mixed mode)
PS: TheHive and ElasticSearch are on the same server, and I installed TheHive using the RPM package: # yum install thehive4.
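To see which JanusGraph storage backends are actually bundled, I suppose the installed jars can be listed; /opt/thehive is my assumption for the RPM install location:

ls /opt/thehive/lib | grep -i janusgraph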
Below is the "application.conf" for TheHive4:
// Documentation is available at https://github.com/TheHive-Project/TheHiveDocs/TheHive4
// Include Play secret key
// More information on secret key at https://www.playframework.com/documentation/2.8.x/ApplicationSecret
include "/etc/thehive/secret.conf"
// Database configuration
//db.janusgraph {
// Elasticsearch
search {
  // Basic configuration
  // Index name.
  index = the_hive
  // ElasticSearch instance address.
  uri = "http://127.0.0.1:9200/"
  // Scroll keepalive
  keepalive = 1m
  // Size of the page for scroll
  pagesize = 50
  // Number of shards
  nbshards = 5
  // Number of replicas
  nbreplicas = 1
  // Arbitrary settings
  settings {
    // Maximum number of nested fields
    mapping.nested_fields.limit = 100
  }
  // Authentication configuration
  //search.username = ""
  //search.password = ""
  // SSL configuration
  //search.keyStore {
  //  path = "/path/to/keystore"
  //  type = "JKS" # or PKCS12
  //  password = "keystore-password"
  //}
  //search.trustStore {
  //  path = "/path/to/trustStore"
  //  type = "JKS" # or PKCS12
  //  password = "trustStore-password"
  //}
}
//storage {
//  // Cassandra configuration
//  // More information at https://docs.janusgraph.org/basics/configuration-reference/#storagecql
//  backend: cql
//  hostname: ["ip1", "ip2"]
//  // Cassandra authentication (if configured)
//  username: "thehive"
//  password: "password"
//  cql {
//    cluster-name: thp
//    keyspace: thehive
//  }
//}

// For test only !
// Comment the Cassandra settings before enabling the Berkeley database
//storage.backend: berkeleyje
//storage.directory: /path/to/berkeleydb
//berkeleyje.freeDisk: 200 # disk usage threshold
//}
// Attachment storage configuration
storage {
  // Local filesystem
  //provider: localfs
  //localfs.location: /path/to/files

  // Hadoop filesystem (HDFS)
  //provider: hdfs
  //hdfs {
  //  root: "hdfs://localhost:10000" # namenode server hostname
  //  location: "/thehive"           # location inside HDFS
  //  username: thehive              # file owner
  //}

  // Datastore
  datastore {
    name = data
    // Size of stored data chunks
    chunksize = 50k
    hash {
      // Main hash algorithm /!\ Don't change this value
      main = "SHA-256"
      // Additional hash algorithms (used in attachments)
      extra = ["SHA-1", "MD5"]
    }
    attachment.password = "malware"
  }
}
// Authentication configuration
// More information at https://github.com/TheHive-Project/TheHiveDocs/TheHive4/Administration/Authentication.md
auth {
  providers: [
    {name: session} # required !
    {name: basic, realm: thehive}
    {name: local}
    {name: key}
  ]
  // Logins must be in valid email address format. If the provided login doesn't contain `@`,
  // the following domain is automatically appended.
  defaultUserDomain: "thehive.local"
}
// CORTEX configuration
// More information at https://github.com/TheHive-Project/TheHiveDocs/TheHive4/Administration/Connectors.md
// Enable the Cortex connector
//play.modules.enabled += org.thp.thehive.connector.cortex.CortexModule
//cortex {
//  servers: [
//    {
//      name: "local"                # Cortex name
//      url: "http://localhost:9001" # URL of the Cortex instance
//      auth {
//        type: "bearer"
//        key: "***"                 # Cortex API key
//      }
//      wsConfig {}                  # HTTP client configuration (SSL and proxy)
//    }
//  ]
//}
// MISP configuration
// More information at https://github.com/TheHive-Project/TheHiveDocs/TheHive4/Administration/Connectors.md
// Enable the MISP connector
//play.modules.enabled += org.thp.thehive.connector.misp.MispModule
//misp {
//  interval: 1 hour
//  servers: [
//    {
//      name = "local"            # MISP name
//      url = "http://localhost/" # URL of MISP
//      auth {
//        type = key
//        key = "***"             # MISP API key
//      }
//      wsConfig {}               # HTTP client configuration (SSL and proxy)
//    }
//  ]
//}
// Streaming
stream.longpolling {
  // Maximum time a stream request waits for a new element
  refresh = 1m
  // Lifetime of the stream session without a request
  cache = 15m
  nextItemMaxWait = 500ms
  globalMaxWait = 1s
}
// Max textual content length
play.http.parser.maxMemoryBuffer = 1M
// Max file size
play.http.parser.maxDiskBuffer = 1G
// Define maximum size of attachments (default 10MB)
//play.http.parser.maxDiskBuffer: 1GB
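In case it is relevant: my understanding from the TheHive4 documentation is that, unlike in TheHive3, ElasticSearch here can only act as a JanusGraph index backend, and a storage backend (Cassandra, or Berkeley for tests) is still required; the search { } block above appears to be TheHive3 syntax. A sketch of the combination I believe is intended, with placeholder hostnames:

db.janusgraph {
  storage {
    backend: cql
    hostname: ["127.0.0.1"]  // placeholder; Cassandra node(s)
    cql {
      cluster-name: thp
      keyspace: thehive
    }
  }
  index.search {
    backend: elasticsearch
    hostname: ["127.0.0.1"]  // placeholder; ElasticSearch node(s)
    index-name: thehive
  }
}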
I hope you can help me; I am at your disposal if you need any additional information.