datahub-project / datahub

The Metadata Platform for your Data Stack
https://datahubproject.io
Apache License 2.0

Datasets not visible on the UI #1179

Closed ankurgadgilwar closed 6 years ago

ankurgadgilwar commented 6 years ago

Error: Could not find or load main class wherehows.common.jobs.Launcher

Above is the error I am getting while running the HIVE_METADATA_ETL job; it appears in the stderr file.

2018-05-22 17:26:01 DEBUG application:74 - run command : [java, -cp, , -Dconfig=/WhereHows/temporary/HIVE_METADATA_ETL/2/2.properties, -DCONTEXT=HIVE_METADATA_ETL, -Dlogback.configurationFile=etl_logback.xml, -DLOG_DIR=/WhereHows/temporary/HIVE_METADATA_ETL/2, wherehows.common.jobs.Launcher] ; timeout: 12000
2018-05-22 17:26:01 ERROR application:123 - *** Process + 626585 failed, status: 1
2018-05-22 17:26:01 ERROR application:124 - Error Details:

java.lang.Exception: Process + 626585 failed
        at actors.EtlJobActor.onReceive(EtlJobActor.java:125)
        at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:167)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
        at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
2018-05-22 17:26:01 ERROR application:140 - ETL job (jobName:HIVE_METADATA_ETL, whEtlExecId:2) got a problem
<============-> 96% EXECUTING [3h 37m 48s]

That's the actual error. Please help.

mtHuberty commented 6 years ago

Can you locate the actual log file for that job? They contain more details and I know they get saved somewhere but I can't remember where off the top of my head.

mtHuberty commented 6 years ago

For example, the location of my log file for my teradata etl job is /var/lib/docker/aufs/diff/e4bd6d24b56d1964b388495f204abf91c43409fdbe0eb28228c8c4911c0451d7/application/logs/etl/TERADATA/8/td_metadata.log

Granted, I'm using a docker image to run the backend, so the 'var/lib/docker/aufs/diff/long-number' part will likely differ.

Assuming you're on Linux, try running sudo find / -name '*.log'. Replace the * with the exact name of the job if you know it, and you should find it right away.

ankurgadgilwar commented 6 years ago

@mtHuberty you see, that's the issue. I went to the usual location where the logs should have been and, strangely, they aren't there. All I see is the stderr file in some directory, and it doesn't give me much insight into the error. Thank you so much for your input; I'll try looking into this. Will keep you posted.

ankurgadgilwar commented 6 years ago

There is absolutely nothing on the entire system that captures a log for this job.

mtHuberty commented 6 years ago

Can you post the contents of your HIVE_METADATA_ETL.job file? Make sure to redact/remove private DB connection details.

I don't want to get your hopes up about me being able to help because I'm pretty new to this myself, but I've been struggling through the WhereHows world for a few weeks now and maybe I'll see something that you don't.

ankurgadgilwar commented 6 years ago
# Fetch Hive dataset metadata

# Common ETL configs
job.class=metadata.etl.dataset.hive.HiveMetadataEtl
job.cron.expr=0 5 * * * ? *
job.timeout=12000
#job.cmd.params=
#job.disabled=1
job.ref.id=65
#useSSL=false
# hive metastore jdbc url
hive.metastore.jdbc.url=*URL*
# hive metastore jdbc driver
hive.metastore.jdbc.driver=com.mysql.jdbc.Driver

# hive metastore user name
hive.metastore.username=hive

# hive metastore password
hive.metastore.password=*password*

# hive metastore DB reconnect interval
hive.metastore.reconnect.time=300

#hive.database_black_list=your_database_black_list

#hive.database_white_list=your_database_white_list

# Place to store the schema csv file
hive.schema_csv_file=/var/tmp/hive_schema.csv

# Place to store the schema json file
hive.schema_json_file=/var/tmp/hive_schema.json

# Place to store the field metadata csv file
hive.field_metadata=/var/tmp/hive_field_metadata.csv

# Place to store the hdfs map csv file
hive.hdfs_map_csv_file=/var/tmp/hive_hdfs_map.csv

# Place to store the hive instance csv file
hive.instance_csv_file=/var/tmp/hive_instance.csv

# Place to store the dependency csv file
hive.dependency_csv_file=/var/tmp/hive_dependency.csv

# HDFS namenode IPC URI
hdfs.namenode.ipc.uri=*ipc.uri*

# Enable/disable kerberos authentication & related configs
#kerberos.auth=False
#kerberos.keytab.file=your_keytab_file
#kerberos.principal=your_principal

#krb5.kdc=your_kdc
#krb5.realm=your_realm

# innodb_lock_wait_timeout when accessing MySQL Db
innodb_lock_wait_timeout=1500
mtHuberty commented 6 years ago

Interesting... there's no place in the config that sets a job-specific log file. Take a look at my Teradata job file for comparison. (I used the TERADATA_METADATA_ETL.job template and just put my details in it, so this was set up as-is before I used it.)


# Common ETL configs
job.class=metadata.etl.dataset.teradata.TeradataMetadataEtl
job.cron.expr=0 44 19 ? * * *
job.timeout=15000
#job.cmd.params=
#job.disabled=1
job.ref.id=3

# Teradata database username
teradata.db.username=*user*

# Teradata database password
teradata.db.password=*password*

# Teradata JDBC driver
teradata.db.driver=com.teradata.jdbc.TeraDriver

# Teradata database JDBC URL, excluding username and password
teradata.db.jdbc.url=*url*

# The databases to be collected, comma separated, e.g. financial,manufacturing
teradata.databases=*list of DBs*

# Default Teradata database for connection
teradata.default_database=

# Place to store the log file
teradata.log=td_metadata.log

# Place to store metadata data file
teradata.metadata=td_metadata.dat

# Place to store schema data file
teradata.schema_output=td_schema.json

# Place to store field metadata CSV file
teradata.field_metadata=td_field_metadata.dat

# Place to store sample data file
teradata.sample_output=td_sampledata.csv

# Comma-separated tables you want to skip sample data collection (e.g., for security reasons).
teradata.sample.skip.list=

# Whether to get sample data (True/False)
teradata.load_sample=True

# Collect sample data collection only for certain weekdays
teradata.collect.sample.data.days=

# innodb_lock_wait_timeout when accessing MySQL Db
innodb_lock_wait_timeout=1500

Notice how there's a line for "# Place to store the log file".

Because of this, I would try two things, and the second one is a long shot.

First, double-check that a generic log file wasn't created when your job ran but named something else (not the name of your job, like we'd expect).

Second, you COULD try setting some variables like hive.log=HIVE_METADATA_ETL.log and hive.metadata.log=HIVE_METADATA_ETL.log in your HIVE_METADATA_ETL.job file, in case they left out the line that sets the log file location. The property may well exist in the Java code that parses these config files, but someone may have forgotten to add it to the template. I'm not savvy enough to know where to look in the Java code, so I'd take a couple of stabs in the dark at what it might be named and see if I got lucky. Told you this one would be a long shot.
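For example, the guessed lines would sit alongside the other hive.* settings in the job file; both key names are speculative, and the Hive ETL code may simply ignore them:

# Place to store the log file (speculative key names)
hive.log=HIVE_METADATA_ETL.log
hive.metadata.log=HIVE_METADATA_ETL.log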

Then you'd have to rerun the job and see if it created the log file.

ankurgadgilwar commented 6 years ago

I think this might actually help me either locate the .log file or at least set up a new one. Will try both methods. Thank you very much for your help! Will keep you posted.

ankurgadgilwar commented 6 years ago

@mtHuberty you have mentioned the change of version in https://github.com/linkedin/WhereHows/issues/1131. Can you please help me locate the file where I need to make these changes? I am receiving errors similar to the ones in the link mentioned above.

mtHuberty commented 6 years ago

[Screenshot of the WhereHows GitHub repository page, showing the branch/tag selector]

mtHuberty commented 6 years ago

In the screenshot above, navigate to the tags tab and then select 1.0.0. Then you can clone that version. Just remember, you'll be starting from scratch - so you'll have to re-add any drivers, reconfigure your job file, maybe edit your $CLASSPATH etc again for this new installation. Also something I noticed is that the job config template files are SLIGHTLY different for some of the built-in ETL jobs, so make sure not to just copy/paste your configs from the version you're using now without checking.

ankurgadgilwar commented 6 years ago

Sure, thank you very much for all the help, @mtHuberty!

alidisi commented 6 years ago

The reason for the error is that java.class.path is null. When the process-building code starts the java command, it reports the "could not find or load main class" error because -cp is empty: [java, -cp, , -Dconfig=/var/tmp/wherehows/hive/79/79.properties, -DCONTEXT=hive, -Dlogback.configurationFile=etl_logback.xml, -DLOG_DIR=/var/tmp/wherehows/hive/79, wherehows.common.jobs.Launcher]

I set the classpath in WhereHows/wherehows-backend/app/actors/ConfigUtil.java by replacing String classPath = System.getProperty("java.class.path"); with String classPath = <your classpath>; and the error was resolved.
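For illustration, a minimal sketch of that workaround, assuming the classpath line sits in ConfigUtil.java as described above; the jar directory shown is only a placeholder for wherever your backend's ETL jars actually live:

// WhereHows/wherehows-backend/app/actors/ConfigUtil.java (sketch of the workaround)
// Original line: the classpath is read from the JVM property, which can come back empty
// in this setup, producing "java -cp  ..." with nothing between -cp and the -D flags:
// String classPath = System.getProperty("java.class.path");

// Workaround: hard-code the classpath the spawned ETL process should use.
// "/path/to/wherehows-backend/lib/*" is a placeholder, not the real location.
String classPath = "/path/to/wherehows-backend/lib/*";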

ankurgadgilwar commented 6 years ago

@alidisi thanks for the insight. But when I try to edit the property I am getting an error; I am probably not putting in the right path.

ankurgadgilwar commented 6 years ago

The previous error was indeed resolved thanks to your workaround, but I have now encountered the following issue in the frontend.

2018-05-25 13:38:34 ERROR application:916 - Fetch compliance Error:
javax.persistence.NoResultException: No entity found for query
        at org.hibernate.query.internal.AbstractProducedQuery.getSingleResult(AbstractProducedQuery.java:1446) ~[hibernate-core-5.2.5.Final.jar:5.2.5.Final]
        at org.hibernate.query.criteria.internal.compile.CriteriaQueryTypeQueryAdapter.getSingleResult(CriteriaQueryTypeQueryAdapter.java:107) ~[hibernate-core-5.2.5.Final.jar:5.2.5.Final]
        at wherehows.dao.table.BaseDao.findBy(BaseDao.java:81) ~[wherehows-dao.jar:na]
        at wherehows.dao.table.DatasetComplianceDao.findComplianceById(DatasetComplianceDao.java:52) ~[wherehows-dao.jar:na]
        at wherehows.dao.table.DatasetComplianceDao.getDatasetComplianceByDatasetId(DatasetComplianceDao.java:57) ~[wherehows-dao.jar:na]
        at controllers.api.v1.Dataset.getDatasetCompliance(Dataset.java:908) ~[wherehows-frontend.jar:na]
        at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$74$$anonfun$apply$74.apply(Routes.scala:2976) [wherehows-frontend.jar:na]
        at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$74$$anonfun$apply$74.apply(Routes.scala:2976) [wherehows-frontend.jar:na]
        at play.core.routing.HandlerInvokerFactory$$anon$5.resultCall(HandlerInvoker.scala:139) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.routing.HandlerInvokerFactory$JavaActionInvokerFactory$$anon$14$$anon$3$$anon$1.invocation(HandlerInvoker.scala:127) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.j.JavaAction$$anon$1.call(JavaAction.scala:70) [play_2.10-2.4.11.jar:2.4.11]
        at play.http.DefaultHttpRequestHandler$1.call(DefaultHttpRequestHandler.java:20) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.j.JavaAction$$anonfun$7.apply(JavaAction.scala:94) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.j.JavaAction$$anonfun$7.apply(JavaAction.scala:94) [play_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) [scala-library-2.10.5.jar:na]
        at play.core.j.HttpExecutionContext$$anon$2.run(HttpExecutionContext.scala:40) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Execution$trampoline$.execute(Execution.scala:70) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.core.j.HttpExecutionContext.execute(HttpExecutionContext.scala:32) [play_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$.apply(Future.scala:31) [scala-library-2.10.5.jar:na]
        at scala.concurrent.Future$.apply(Future.scala:485) [scala-library-2.10.5.jar:na]
        at play.core.j.JavaAction.apply(JavaAction.scala:94) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:105) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:105) [play_2.10-2.4.11.jar:2.4.11]
        at play.utils.Threads$.withContextClassLoader(Threads.scala:21) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:104) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:103) [play_2.10-2.4.11.jar:2.4.11]
        at scala.Option.map(Option.scala:145) [scala-library-2.10.5.jar:na]
        at play.api.mvc.Action$$anonfun$apply$1.apply(Action.scala:103) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1.apply(Action.scala:96) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$mapM$1.apply(Iteratee.scala:524) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$mapM$1.apply(Iteratee.scala:524) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMapM$1.apply(Iteratee.scala:560) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMapM$1.apply(Iteratee.scala:560) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMap$1$$anonfun$apply$13.apply(Iteratee.scala:536) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMap$1$$anonfun$apply$13.apply(Iteratee.scala:536) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) [scala-library-2.10.5.jar:na]
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40) [akka-actor_2.10-2.3.13.jar:na]
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397) [akka-actor_2.10-2.3.13.jar:na]
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) [scala-library-2.10.5.jar:na]
2018-05-25 13:38:34 ERROR application:216 -

! @783p4l3f5 - Internal server error, for (GET) [/api/v2/list/complianceDataTypes] ->

play.api.http.HttpErrorHandlerExceptions$$anon$1: Execution exception[[UnsupportedOperationException: Operation not implemented]]
        at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:265) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:191) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.api.GlobalSettings$class.onError(GlobalSettings.scala:179) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.DefaultGlobal$.onError(GlobalSettings.scala:212) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.http.GlobalSettingsHttpErrorHandler.onServerError(HttpErrorHandler.scala:94) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$9$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:162) [play-netty-server_2.10-2.4.11.jar:2.4.11]
        at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$9$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:159) [play-netty-server_2.10-2.4.11.jar:2.4.11]
        at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33) [scala-library-2.10.5.jar:na]
        at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185) [scala-library-2.10.5.jar:na]
        at scala.util.Try$.apply(Try.scala:161) [scala-library-2.10.5.jar:na]
        at scala.util.Failure.recover(Try.scala:185) [scala-library-2.10.5.jar:na]
        at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:324) [scala-library-2.10.5.jar:na]
        at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:324) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32) [scala-library-2.10.5.jar:na]
        at play.api.libs.iteratee.Execution$trampoline$.executeScheduled(Execution.scala:109) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Execution$trampoline$.execute(Execution.scala:71) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248) [scala-library-2.10.5.jar:na]
        at scala.concurrent.Promise$class.complete(Promise.scala:55) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:153) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:23) [scala-library-2.10.5.jar:na]
        at play.core.j.HttpExecutionContext$$anon$2.run(HttpExecutionContext.scala:40) [play_2.10-2.4.11.jar:2.4.11]
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40) [akka-actor_2.10-2.3.13.jar:na]
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397) [akka-actor_2.10-2.3.13.jar:na]
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) [scala-library-2.10.5.jar:na]
Caused by: java.lang.UnsupportedOperationException: Operation not implemented
        at wherehows.dao.view.DataTypesViewDao.getAllComplianceDataTypes(DataTypesViewDao.java:30) ~[wherehows-dao.jar:na]
        at controllers.api.v2.Dataset.lambda$getComplianceDataTypes$7(Dataset.java:99) ~[wherehows-frontend.jar:na]
        at play.core.j.FPromiseHelper$$anonfun$promise$2.apply(FPromiseHelper.scala:36) ~[play_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) [scala-library-2.10.5.jar:na]
        ... 7 common frames omitted
2018-05-25 13:38:34 ERROR application:993 - Fetch compliance suggestion Error:
java.lang.UnsupportedOperationException: Compliance Suggestion not implemented.
        at wherehows.dao.table.DatasetComplianceDao.findComplianceSuggestionByUrn(DatasetComplianceDao.java:131) ~[wherehows-dao.jar:na]
        at controllers.api.v1.Dataset.getDatasetSuggestedCompliance(Dataset.java:985) [wherehows-frontend.jar:na]
        at controllers.api.v1.Dataset.getDatasetSuggestedCompliance(Dataset.java:971) [wherehows-frontend.jar:na]
        at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$76$$anonfun$apply$76.apply(Routes.scala:2988) [wherehows-frontend.jar:na]
        at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$76$$anonfun$apply$76.apply(Routes.scala:2988) [wherehows-frontend.jar:na]
        at play.core.routing.HandlerInvokerFactory$$anon$5.resultCall(HandlerInvoker.scala:139) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.routing.HandlerInvokerFactory$JavaActionInvokerFactory$$anon$14$$anon$3$$anon$1.invocation(HandlerInvoker.scala:127) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.j.JavaAction$$anon$1.call(JavaAction.scala:70) [play_2.10-2.4.11.jar:2.4.11]
        at play.http.DefaultHttpRequestHandler$1.call(DefaultHttpRequestHandler.java:20) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.j.JavaAction$$anonfun$7.apply(JavaAction.scala:94) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.j.JavaAction$$anonfun$7.apply(JavaAction.scala:94) [play_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) [scala-library-2.10.5.jar:na]
        at play.core.j.HttpExecutionContext$$anon$2.run(HttpExecutionContext.scala:40) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Execution$trampoline$.execute(Execution.scala:70) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.core.j.HttpExecutionContext.execute(HttpExecutionContext.scala:32) [play_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$.apply(Future.scala:31) [scala-library-2.10.5.jar:na]
        at scala.concurrent.Future$.apply(Future.scala:485) [scala-library-2.10.5.jar:na]
        at play.core.j.JavaAction.apply(JavaAction.scala:94) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:105) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:105) [play_2.10-2.4.11.jar:2.4.11]
        at play.utils.Threads$.withContextClassLoader(Threads.scala:21) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:104) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:103) [play_2.10-2.4.11.jar:2.4.11]
        at scala.Option.map(Option.scala:145) [scala-library-2.10.5.jar:na]
        at play.api.mvc.Action$$anonfun$apply$1.apply(Action.scala:103) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1.apply(Action.scala:96) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$mapM$1.apply(Iteratee.scala:524) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$mapM$1.apply(Iteratee.scala:524) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMapM$1.apply(Iteratee.scala:560) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMapM$1.apply(Iteratee.scala:560) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMap$1$$anonfun$apply$13.apply(Iteratee.scala:536) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMap$1$$anonfun$apply$13.apply(Iteratee.scala:536) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) [scala-library-2.10.5.jar:na]
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40) [akka-actor_2.10-2.3.13.jar:na]
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397) [akka-actor_2.10-2.3.13.jar:na]
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) [scala-library-2.10.5.jar:na]
2018-05-25 13:38:34 WARN  application:139 - Failed to get dataset view
java.lang.NullPointerException: null
        at wherehows.dao.view.DatasetViewDao.fillDatasetViewFromDictDataset(DatasetViewDao.java:94) ~[wherehows-dao.jar:na]
        at wherehows.dao.view.DatasetViewDao.getDatasetView(DatasetViewDao.java:74) ~[wherehows-dao.jar:na]
        at controllers.api.v1.Dataset.getDatasetViewById(Dataset.java:137) ~[wherehows-frontend.jar:na]
        at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$45$$anonfun$apply$45.apply(Routes.scala:2802) [wherehows-frontend.jar:na]
        at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$45$$anonfun$apply$45.apply(Routes.scala:2802) [wherehows-frontend.jar:na]
        at play.core.routing.HandlerInvokerFactory$$anon$5.resultCall(HandlerInvoker.scala:139) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.routing.HandlerInvokerFactory$JavaActionInvokerFactory$$anon$14$$anon$3$$anon$1.invocation(HandlerInvoker.scala:127) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.j.JavaAction$$anon$1.call(JavaAction.scala:70) [play_2.10-2.4.11.jar:2.4.11]
        at play.http.DefaultHttpRequestHandler$1.call(DefaultHttpRequestHandler.java:20) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.j.JavaAction$$anonfun$7.apply(JavaAction.scala:94) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.j.JavaAction$$anonfun$7.apply(JavaAction.scala:94) [play_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) [scala-library-2.10.5.jar:na]
        at play.core.j.HttpExecutionContext$$anon$2.run(HttpExecutionContext.scala:40) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Execution$trampoline$.execute(Execution.scala:70) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.core.j.HttpExecutionContext.execute(HttpExecutionContext.scala:32) [play_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$.apply(Future.scala:31) [scala-library-2.10.5.jar:na]
        at scala.concurrent.Future$.apply(Future.scala:485) [scala-library-2.10.5.jar:na]
        at play.core.j.JavaAction.apply(JavaAction.scala:94) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:105) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:105) [play_2.10-2.4.11.jar:2.4.11]
        at play.utils.Threads$.withContextClassLoader(Threads.scala:21) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:104) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:103) [play_2.10-2.4.11.jar:2.4.11]
        at scala.Option.map(Option.scala:145) [scala-library-2.10.5.jar:na]
        at play.api.mvc.Action$$anonfun$apply$1.apply(Action.scala:103) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1.apply(Action.scala:96) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$mapM$1.apply(Iteratee.scala:524) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$mapM$1.apply(Iteratee.scala:524) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMapM$1.apply(Iteratee.scala:560) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMapM$1.apply(Iteratee.scala:560) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMap$1$$anonfun$apply$13.apply(Iteratee.scala:536) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMap$1$$anonfun$apply$13.apply(Iteratee.scala:536) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) [scala-library-2.10.5.jar:na]
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40) [akka-actor_2.10-2.3.13.jar:na]
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397) [akka-actor_2.10-2.3.13.jar:na]
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) [scala-library-2.10.5.jar:na]
2018-05-25 13:38:35 ERROR application:216 -

! @783p4l4im - Internal server error, for (GET) [/api/v1/datasets/1/access] ->

play.api.http.HttpErrorHandlerExceptions$$anon$1: Execution exception[[BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT DISTINCT partition_grain FROM log_dataset_instance_load_status WHERE dataset_id = ? order by 1]; nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'wherehows.log_dataset_instance_load_status' doesn't exist]]
        at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:265) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:191) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.api.GlobalSettings$class.onError(GlobalSettings.scala:179) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.DefaultGlobal$.onError(GlobalSettings.scala:212) [play_2.10-2.4.11.jar:2.4.11]
        at play.api.http.GlobalSettingsHttpErrorHandler.onServerError(HttpErrorHandler.scala:94) [play_2.10-2.4.11.jar:2.4.11]
        at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$9$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:162) [play-netty-server_2.10-2.4.11.jar:2.4.11]
        at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$9$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:159) [play-netty-server_2.10-2.4.11.jar:2.4.11]
        at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33) [scala-library-2.10.5.jar:na]
        at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185) [scala-library-2.10.5.jar:na]
        at scala.util.Try$.apply(Try.scala:161) [scala-library-2.10.5.jar:na]
        at scala.util.Failure.recover(Try.scala:185) [scala-library-2.10.5.jar:na]
        at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:324) [scala-library-2.10.5.jar:na]
        at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:324) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32) [scala-library-2.10.5.jar:na]
        at play.api.libs.iteratee.Execution$trampoline$.executeScheduled(Execution.scala:109) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Execution$trampoline$.execute(Execution.scala:71) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248) [scala-library-2.10.5.jar:na]
        at scala.concurrent.Promise$class.complete(Promise.scala:55) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:153) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:23) [scala-library-2.10.5.jar:na]
        at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40) [akka-actor_2.10-2.3.13.jar:na]
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397) [akka-actor_2.10-2.3.13.jar:na]
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979) [scala-library-2.10.5.jar:na]
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107) [scala-library-2.10.5.jar:na]
Caused by: org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT DISTINCT partition_grain FROM log_dataset_instance_load_status WHERE dataset_id = ? order by 1]; nested exception is com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'wherehows.log_dataset_instance_load_status' doesn't exist
        at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:231) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:73) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:660) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:695) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:727) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:737) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:787) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.jdbc.core.JdbcTemplate.queryForList(JdbcTemplate.java:882) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at dao.DatasetsDAO.getDatasetPartitionGains(DatasetsDAO.java:1850) ~[wherehows-frontend.jar:na]
        at dao.DatasetsDAO.getDatasetAccessibilty(DatasetsDAO.java:1860) ~[wherehows-frontend.jar:na]
        at controllers.api.v1.Dataset.getDatasetAccess(Dataset.java:875) ~[wherehows-frontend.jar:na]
        at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$56$$anonfun$apply$56.apply(Routes.scala:2868) ~[wherehows-frontend.jar:na]
        at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$56$$anonfun$apply$56.apply(Routes.scala:2868) ~[wherehows-frontend.jar:na]
        at play.core.routing.HandlerInvokerFactory$$anon$4.resultCall(HandlerInvoker.scala:136) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.core.routing.HandlerInvokerFactory$JavaActionInvokerFactory$$anon$14$$anon$3$$anon$1.invocation(HandlerInvoker.scala:127) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.core.j.JavaAction$$anon$1.call(JavaAction.scala:70) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.http.DefaultHttpRequestHandler$1.call(DefaultHttpRequestHandler.java:20) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.core.j.JavaAction$$anonfun$7.apply(JavaAction.scala:94) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.core.j.JavaAction$$anonfun$7.apply(JavaAction.scala:94) ~[play_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) [scala-library-2.10.5.jar:na]
        at play.core.j.HttpExecutionContext$$anon$2.run(HttpExecutionContext.scala:40) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Execution$trampoline$.execute(Execution.scala:70) [play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.core.j.HttpExecutionContext.execute(HttpExecutionContext.scala:32) ~[play_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$.apply(Future.scala:31) ~[scala-library-2.10.5.jar:na]
        at scala.concurrent.Future$.apply(Future.scala:485) ~[scala-library-2.10.5.jar:na]
        at play.core.j.JavaAction.apply(JavaAction.scala:94) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:105) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:105) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.utils.Threads$.withContextClassLoader(Threads.scala:21) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:104) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:103) ~[play_2.10-2.4.11.jar:2.4.11]
        at scala.Option.map(Option.scala:145) ~[scala-library-2.10.5.jar:na]
        at play.api.mvc.Action$$anonfun$apply$1.apply(Action.scala:103) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.api.mvc.Action$$anonfun$apply$1.apply(Action.scala:96) ~[play_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$mapM$1.apply(Iteratee.scala:524) ~[play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$mapM$1.apply(Iteratee.scala:524) ~[play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMapM$1.apply(Iteratee.scala:560) ~[play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMapM$1.apply(Iteratee.scala:560) ~[play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMap$1$$anonfun$apply$13.apply(Iteratee.scala:536) ~[play-iteratees_2.10-2.4.11.jar:2.4.11]
        at play.api.libs.iteratee.Iteratee$$anonfun$flatMap$1$$anonfun$apply$13.apply(Iteratee.scala:536) ~[play-iteratees_2.10-2.4.11.jar:2.4.11]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24) [scala-library-2.10.5.jar:na]
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24) [scala-library-2.10.5.jar:na]
        ... 6 common frames omitted
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'wherehows.log_dataset_instance_load_status' doesn't exist
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_121]
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_121]
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_121]
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_121]
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:425) ~[mysql-connector-java-5.1.40.jar:5.1.40]
        at com.mysql.jdbc.Util.getInstance(Util.java:408) ~[mysql-connector-java-5.1.40.jar:5.1.40]
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:943) ~[mysql-connector-java-5.1.40.jar:5.1.40]
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970) ~[mysql-connector-java-5.1.40.jar:5.1.40]
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906) ~[mysql-connector-java-5.1.40.jar:5.1.40]
        at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2524) ~[mysql-connector-java-5.1.40.jar:5.1.40]
        at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2677) ~[mysql-connector-java-5.1.40.jar:5.1.40]
        at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2549) ~[mysql-connector-java-5.1.40.jar:5.1.40]
        at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:1861) ~[mysql-connector-java-5.1.40.jar:5.1.40]
        at com.mysql.jdbc.PreparedStatement.executeQuery(PreparedStatement.java:1962) ~[mysql-connector-java-5.1.40.jar:5.1.40]
        at com.jolbox.bonecp.PreparedStatementHandle.executeQuery(PreparedStatementHandle.java:174) ~[bonecp-0.8.0.RELEASE.jar:na]
        at org.springframework.jdbc.core.JdbcTemplate$1.doInPreparedStatement(JdbcTemplate.java:703) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:644) ~[spring-jdbc-4.1.6.RELEASE.jar:4.1.6.RELEASE]
        ... 46 common frames omitted
ankurgadgilwar commented 6 years ago

Facing the below issue now. Any help would be appreciated. I am using a v1.0.0 tag.

java.lang.Exception: Process + 747770 failed
        at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:167)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
        at actors.EtlJobActor.onReceive(EtlJobActor.java:123)
        at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
03:10:07.811 [QUIET] [system.out]
03:10:07.811 [QUIET] [system.out] 2018-06-07 03:10:07 ERROR application:138 - ETL job (jobName:HIVE_METADATA_ETL, whEtlExecId:8) got a problem

mtHuberty commented 6 years ago

I'm facing a very similar issue today:

2018-06-07T10:22:16.91-0500 [APP/PROC/WEB/0] OUT 2018-06-07 15:22:16 ERROR application:123 - *** Process + 65 failed, status: 1
   2018-06-07T10:22:16.91-0500 [APP/PROC/WEB/0] ERR java.lang.Exception: Process + 65 failed
   2018-06-07T10:22:16.91-0500 [APP/PROC/WEB/0] ERR     at actors.EtlJobActor.onReceive(EtlJobActor.java:125)
   2018-06-07T10:22:16.91-0500 [APP/PROC/WEB/0] ERR     at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:167)
   2018-06-07T10:22:16.91-0500 [APP/PROC/WEB/0] ERR     at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
   2018-06-07T10:22:16.91-0500 [APP/PROC/WEB/0] ERR     at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
   2018-06-07T10:22:16.91-0500 [APP/PROC/WEB/0] ERR     at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
   2018-06-07T10:22:16.91-0500 [APP/PROC/WEB/0] ERR     at akka.actor.ActorCell.invoke(ActorCell.scala:487)
   2018-06-07T10:22:16.91-0500 [APP/PROC/WEB/0] ERR     at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
   2018-06-07T10:22:16.91-0500 [APP/PROC/WEB/0] ERR     at akka.dispatch.Mailbox.run(Mailbox.scala:220)
   2018-06-07T10:22:16.91-0500 [APP/PROC/WEB/0] ERR     at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
   2018-06-07T10:22:16.91-0500 [APP/PROC/WEB/0] OUT 2018-06-07 15:22:16 ERROR application:124 - Error Details:
   2018-06-07T10:22:16.92-0500 [APP/PROC/WEB/0] OUT 2018-06-07 15:22:16 ERROR application:140 - ETL job (jobName:usmgen-oracle-ETL, whEtlExecId:3) got a problem
   2018-06-07T10:22:16.92-0500 [APP/PROC/WEB/0] ERR     at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
   2018-06-07T10:22:16.92-0500 [APP/PROC/WEB/0] ERR     at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
   2018-06-07T10:22:16.92-0500 [APP/PROC/WEB/0] ERR     at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
   2018-06-07T10:22:16.92-0500 [APP/PROC/WEB/0] ERR     at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
mtHuberty commented 6 years ago

I'm making some progress; I located my log file for this job at /application/logs/etl/usmgen-oracle-ETL/3/log.

First line reveals my error: java.lang.UnsatisfiedLinkError: no ocijdbc12 in java.library.path

I do have the drivers, but I'm betting I need to set the $CLASSPATH to include their locations. Just thought I would share in case my debugging process helps you!
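As a quick sanity check, a tiny throwaway class (purely a diagnostic sketch, not part of WhereHows) can print the paths the JVM actually resolves, so you can compare them with where your drivers really are:

public class PathCheck {
    // Prints the classpath and native library path the JVM sees at startup.
    // Launch it the same way the ETL process is launched to get a faithful comparison.
    public static void main(String[] args) {
        System.out.println("java.class.path   = " + System.getProperty("java.class.path"));
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
    }
}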

ankurgadgilwar commented 6 years ago

@mtHuberty as we discussed earlier, I am trying to run the HIVE_METADATA_ETL job, which has no option to create or store a log file, so checking the logs is almost a non-starter for me. However, I am getting the following error in the HIVE_METADATA_ETL.stderr file: Error: Could not find or load main class metadata.etl.Launcher. This is entirely new, and after updating the Java classpath in the ConfigUtil.java file I am also getting a NullPointerException.

java.lang.Exception: Process + 747770 failed
        at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:167)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
        at actors.EtlJobActor.onReceive(EtlJobActor.java:123)
        at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
03:10:07.811 [QUIET] [system.out]
03:10:07.811 [QUIET] [system.out] 2018-06-07 03:10:07 ERROR application:138 - ETL job (jobName:HIVE_METADATA_ETL, whEtlExecId:8) got a problem

I just went through the open issues; many people are facing similar problems. The contributors on this project don't seem very responsive, so I guess this might take a while to get resolved.

mtHuberty commented 6 years ago

Just to try it, I added a HIVE_METADATA_ETL.job to my /wherehows-backend/jobs/ directory, and I was able to locate a log file that it created at /application/logs/etl/HIVE_METADATA_ETL/7/log after it tried to run and failed. (Mine failed because I don't actually have a Hive DB to connect to; I was just doing it to try to find a log file location for you.) I assume you've already checked that location, but in case you haven't, maybe it's there.

Edit: I feel like the /7/ portion of that path could be variable, so maybe just peek at /application/logs/etl/HIVE_METADATA_ETL/

ankurgadgilwar commented 6 years ago

@mtHuberty Thank you so much, man, I will give this a try! Also, could you confirm whether the error was a single-line error?

mtHuberty commented 6 years ago

The log file contained at least 60 lines of errors. Yours should be different, but it very likely will contain multiple lines of error code, and not just a single line.

ankurgadgilwar commented 6 years ago

Have you specified the above-mentioned path somewhere in your application.env or application.conf file? I don't even have an /application directory to start with. There is a log at /WhereHows/wherehows-backend/logs/application.log, but it just contains [INFO] from play.core.server.NettyServer in main - Listening for HTTP on /0:0:0:0:0:0:0:0:19000, and that's all. I think I am missing something really silly.

What I am getting now is:

2018-06-08 13:04:35 ERROR application:138 - ETL job (jobName:HIVE_METADATA_ETL, whEtlExecId:68) got a problem
[ERROR] [06/08/2018 13:04:35.652] [WhereHowsETLService-akka.actor.default-dispatcher-5] [akka://WhereHowsETLService/user/EtlJobActor] null
java.lang.NullPointerException
        at actors.EtlJobActor.onReceive(EtlJobActor.java:139)
        at akka.actor.UntypedActor$$anonfun$receive$1.applyOrElse(UntypedActor.scala:167)
        at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
        at akka.actor.UntypedActor.aroundReceive(UntypedActor.scala:97)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
        at akka.actor.ActorCell.invoke(ActorCell.scala:487)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
        at akka.dispatch.Mailbox.run(Mailbox.scala:220)
        at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
        at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

Are you using Docker to run the application?

mtHuberty commented 6 years ago

Yes. I'm using docker, and I'm running tag 1.0.0.

I'm actually running one version of the application in docker containers in Cloud Foundry, and another version in docker containers in an AWS EC2 instance. (Long story, I need both of them at the moment). What about you? Docker as well? In what environment?

ankurgadgilwar commented 6 years ago

No Docker, but the tag is the same. Is it imperative that I build it first and then run the backend playBinary? The NullPointerException says that it is hitting somewhere where it is expecting something and not getting it. I don't know what.

mtHuberty commented 6 years ago

I believe so. The instructions I follow involve running the build.sh script in /WhereHows/wherehows-docker, which first builds the project and then creates the Docker images, one of which runs the backend playBinary.

I'm not entirely sure though.

cgf120 commented 6 years ago

Include the URL within double quotes ("") and please try it.

ankurgadgilwar commented 6 years ago

@mtHuberty do you mean that I should include the quotation marks while typing in the URL in my browser? Can you give me an example, please? This has been pending for me for a really long time now.

mtHuberty commented 6 years ago

Sorry, I'm not sure what @cgf120 is referring to about a URL

ankurgadgilwar commented 6 years ago

I was able to get through the issues by simply using the Docker and docker-compose approach. The datasets are indeed visible on the UI. I couldn't share a screenshot, but it has worked just fine. Make sure you have CentOS 7.x or above.

ankurgadgilwar commented 6 years ago

Summary: I had to merge components from different branches to make everything run.

Detailed events: I created the HIVE_METADATA_ETL.job file using the template given by WhereHows and started with the master branch. After completing everything, I faced the same issue: the jobs were not getting scheduled, and there were some issues with the frontend as well. In the WhereHows community, a few committers have posted that tag 1.0.0 is stable, so next I pulled the v1.0.0 branch and tried building the Docker containers. The build failed because the bower version used by v1.0.0 was incompatible with the current release, so I replaced all the bower-related files in the v1.0.0 branch with the files from the master branch, and the build succeeded. Then I started the Docker containers, but that still didn't help. I checked the logs and saw that the job was now getting scheduled but the datasets were not getting populated. I then noticed that in the v1.0.0 branch only 3 containers run (frontend, backend and mysql), with no Elasticsearch container, which was present in the master branch. Next, I made the required changes in the environment file and docker-compose.yml, added the Elasticsearch Docker files, ran "docker-compose up", and started the containers. Everything was now running well and the datasets started to populate.

ankurgadgilwar commented 6 years ago

I have created a detailed document with an end-to-end solution for using the Docker option to get this running. I'll check with my peers whether I can upload it here; once I get the go-ahead, I'll upload it.