iamrjt04 opened 5 years ago
In the class `spark-atlas-connector/spark-atlas-connector/src/main/scala/com/hortonworks/spark/atlas/AtlasClientConf.scala`, the default username and password are:

```scala
val CLIENT_USERNAME = ConfigEntry("atlas.client.username", "admin")
val CLIENT_PASSWORD = ConfigEntry("atlas.client.password", "admin123")
```

I was facing an issue because the user credentials configured on my system do not match these defaults. To handle this, one can pass an `atlas-application.properties` file as mentioned in the README, but even after passing that file I was still facing the same issue.
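For reference, here is a minimal sketch of how a `ConfigEntry`-style default behaves (the class shape is simplified from the snippet above, not the connector's exact implementation): a value from the loaded properties wins when present, otherwise the hard-coded fallback (`admin`/`admin123`) is used.

```scala
// Simplified sketch of a key/default pair, as in AtlasClientConf.
case class ConfigEntry(key: String, defaultValue: String)

object ClientConfSketch {
  val CLIENT_USERNAME = ConfigEntry("atlas.client.username", "admin")
  val CLIENT_PASSWORD = ConfigEntry("atlas.client.password", "admin123")

  // Look the entry up in the loaded properties; fall back to the
  // default when atlas-application.properties supplied no value.
  def get(entry: ConfigEntry, props: Map[String, String]): String =
    props.getOrElse(entry.key, entry.defaultValue)
}
```

So unless `atlas.client.password` is actually picked up from the properties file, the client falls back to `admin123`, which would explain the 401.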
@iamrjt04 I also got a 401 at first, but adding these to `atlas-application.properties` fixed it:
```properties
atlas.client.type=rest
atlas.client.username=admin
atlas.client.password=admin
```
Also, in your case, maybe your Atlas user configuration has the password `admin`, which is then what is required, instead of the `admin123` fallback in the Scala code.
I also changed `atlas.rest.address` to something other than `http://localhost:21000`, because Atlas runs on another machine in my setup, but that probably doesn't apply to you.
Edit: looks like you found it, I missed that you had opened #276.
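A 401 means Atlas rejected the credentials the client sent. The Atlas REST client authenticates with HTTP Basic auth; a small sketch of the `Authorization` header it ends up sending (an illustration of the mechanism, not the client's actual code):

```scala
import java.nio.charset.StandardCharsets
import java.util.Base64

object BasicAuthSketch {
  // Build the value of the Authorization header for HTTP Basic auth:
  // "Basic " followed by base64("user:password").
  def header(user: String, password: String): String = {
    val raw = s"$user:$password".getBytes(StandardCharsets.UTF_8)
    "Basic " + Base64.getEncoder.encodeToString(raw)
  }
}
```

You can also verify the credentials independently of Spark with `curl -u <user>:<password>` against the same endpoint the listener calls (`/api/atlas/v2/types/typedefs`); if that returns 401 too, the problem is on the Atlas side, not in the connector.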
I'm using Atlas 1.0.0 and Spark 2.3.0 on EMR 5.14.0. I'm getting an error while executing the following command:
```shell
spark-shell \
  --jars spark-atlas-connector-assembly_2.11-0.1.0-SNAPSHOT.jar \
  --conf spark.extraListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker \
  --conf spark.sql.queryExecutionListeners=com.hortonworks.spark.atlas.SparkAtlasEventTracker \
  --files atlas-application.properties
```

Running the above command, I get this error:
```
19/07/16 12:24:23 INFO AtlasBaseClient: Client has only one service URL, will use that for all actions: http://localhost:21000
19/07/16 12:24:23 INFO AtlasBaseClient: method=GET path=api/atlas/v2/types/typedefs/ contentType=application/json; charset=UTF-8 accept=application/json status=401
19/07/16 12:24:23 ERROR SparkAtlasEventTracker: Fail to initialize Atlas client, stop this listener
org.apache.atlas.AtlasServiceException: Metadata service API org.apache.atlas.AtlasClientV2$API_V2@2351e74d failed with status 401 (Unauthorized) Response Body ()
    at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:395)
    at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:323)
    at org.apache.atlas.AtlasBaseClient.callAPI(AtlasBaseClient.java:239)
    at org.apache.atlas.AtlasClientV2.getAllTypeDefs(AtlasClientV2.java:124)
    at com.hortonworks.spark.atlas.RestAtlasClient.getAtlasTypeDefs(RestAtlasClient.scala:58)
    at com.hortonworks.spark.atlas.types.SparkAtlasModel$$anonfun$checkAndGroupTypes$1.apply(SparkAtlasModel.scala:104)
    at com.hortonworks.spark.atlas.types.SparkAtlasModel$$anonfun$checkAndGroupTypes$1.apply(SparkAtlasModel.scala:101)
    at scala.collection.immutable.HashMap$HashMap1.foreach(HashMap.scala:221)
    at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
    at scala.collection.immutable.HashMap$HashTrieMap.foreach(HashMap.scala:428)
    at com.hortonworks.spark.atlas.types.SparkAtlasModel$.checkAndGroupTypes(SparkAtlasModel.scala:101)
    at com.hortonworks.spark.atlas.types.SparkAtlasModel$.checkAndCreateTypes(SparkAtlasModel.scala:68)
    at com.hortonworks.spark.atlas.SparkAtlasEventTracker.initializeSparkModel(SparkAtlasEventTracker.scala:108)
    at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:48)
    at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:39)
    at com.hortonworks.spark.atlas.SparkAtlasEventTracker.<init>(SparkAtlasEventTracker.scala:43)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2743)
    at org.apache.spark.util.Utils$$anonfun$loadExtensions$1.apply(Utils.scala:2732)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
    at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74)
    at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
    at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
    at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2732)
    at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2353)
    at org.apache.spark.SparkContext$$anonfun$setupAndStartListenerBus$1.apply(SparkContext.scala:2352)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.SparkContext.setupAndStartListenerBus(SparkContext.scala:2352)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:553)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:238)
    at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
    at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:748)
```
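The `Utils.loadExtensions` frames in the trace show why this failure stops the shell from starting: Spark instantiates each class named in `spark.extraListeners` reflectively during `SparkContext` startup, so any exception thrown in the listener's constructor (here, the Atlas 401) propagates out of context creation. A minimal sketch of that loading pattern (object and method names here are illustrative, not Spark's actual code):

```scala
object ExtensionLoaderSketch {
  // Instantiate a class by name via its no-arg constructor, the way
  // spark.extraListeners entries are loaded. A failure inside the
  // constructor surfaces here as a reflective-instantiation exception,
  // matching the Constructor.newInstance frames in the trace above.
  def load[T](className: String): T =
    Class.forName(className)
      .getDeclaredConstructor()
      .newInstance()
      .asInstanceOf[T]
}
```

This is why fixing the credentials (or making sure `atlas-application.properties` is actually on the driver's classpath) makes the whole error disappear rather than just the 401 line.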