apache / dolphinscheduler

Apache DolphinScheduler is a modern data orchestration platform that makes it easy to create high-performance workflows with low code.
https://dolphinscheduler.apache.org/
Apache License 2.0

[Question] Kerberos authentication failed #3813

Closed shiliquan closed 3 years ago

shiliquan commented 4 years ago

[INFO] 2020-09-24 17:17:54.245 - [taskAppId=TASK-6-21-23]:[121] - -> Connecting to jdbc:hive2://gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:53 INFO Utils: Supplied authorities: gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 20/09/24 17:17:53 INFO CuratorFrameworkImpl: Starting 20/09/24 17:17:53 INFO ZooKeeper: Client environment:zookeeper.version=3.4.6-315--1, built on 08/23/2019 04:37 GMT 20/09/24 17:17:53 INFO ZooKeeper: Client environment:host.name=hdpv3test03.cnbdcu.com 20/09/24 17:17:53 INFO ZooKeeper: Client environment:java.version=1.8.0_202 20/09/24 17:17:53 INFO ZooKeeper: Client environment:java.vendor=Oracle Corporation 20/09/24 17:17:53 INFO ZooKeeper: Client environment:java.home=/usr/jdk/jre 20/09/24 17:17:53 INFO ZooKeeper: Client 
environment:java.class.path=/usr/hdp/3.1.4.0-315/spark2/conf/:/usr/hdp/3.1.4.0-315/spark2/jars/kerb-util-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/super-csv-2.2.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-net-2.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/libthrift-0.12.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/antlr-runtime-3.4.jar:/usr/hdp/3.1.4.0-315/spark2/jars/flogger-0.3.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-io-2.4.jar:/usr/hdp/3.1.4.0-315/spark2/jars/univocity-parsers-2.5.9.jar:/usr/hdp/3.1.4.0-315/spark2/jars/avro-mapred-1.7.7-hadoop2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/stream-2.7.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerb-crypto-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/javax.servlet-api-3.1.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jersey-container-servlet-2.22.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/orc-core-1.4.4-nohive.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-hive-thriftserver_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/javassist-3.18.1-GA.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jsr305-1.3.9.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-launcher_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-auth-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-sketch_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/aircompressor-0.8.jar:/usr/hdp/3.1.4.0-315/spark2/jars/gson-2.2.4.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hk2-utils-2.4.0-b34.jar:/usr/hdp/3.1.4.0-315/spark2/jars/re2j-1.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-configuration2-2.1.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerb-client-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-yarn-common-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-yarn-server-web-proxy-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-logging-1.1.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/breeze_2.11-0.13.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jackson-module
-scala_2.11-2.9.9.jar:/usr/hdp/3.1.4.0-315/spark2/jars/avro-1.7.7.jar:/usr/hdp/3.1.4.0-315/spark2/jars/eigenbase-properties-1.1.5.jar:/usr/hdp/3.1.4.0-315/spark2/jars/snappy-0.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/flogger-system-backend-0.3.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/azure-storage-7.0.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hk2-api-2.4.0-b34.jar:/usr/hdp/3.1.4.0-315/spark2/jars/apache-log4j-extras-1.2.17.jar:/usr/hdp/3.1.4.0-315/spark2/jars/arrow-vector-0.8.0.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/curator-framework-2.12.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/wildfly-openssl-1.0.4.Final.jar:/usr/hdp/3.1.4.0-315/spark2/jars/macro-compat_2.11-1.1.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-mllib_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/datanucleus-rdbms-4.1.7.jar:/usr/hdp/3.1.4.0-315/spark2/jars/chill_2.11-0.8.4.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jline-2.14.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jackson-mapper-asl-1.9.13.jar:/usr/hdp/3.1.4.0-315/spark2/jars/javax.inject-2.4.0-b34.jar:/usr/hdp/3.1.4.0-315/spark2/jars/flatbuffers-1.2.0-3f79e055.jar:/usr/hdp/3.1.4.0-315/spark2/jars/breeze-macros_2.11-0.13.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/machinist_2.11-0.6.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerb-core-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jackson-jaxrs-json-provider-2.9.9.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-network-shuffle_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hppc-0.7.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spire-macros_2.11-0.13.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jcl-over-slf4j-1.7.16.jar:/usr/hdp/3.1.4.0-315/spark2/jars/osgi-resource-locator-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/calcite-core-1.2.0-incubating.jar:/usr/hdp/3.1.4.0-315/spark2/jars/javolution-5.5.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/checker-qual-2.8.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/animal-sniffer-annotations-1.17.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hive-cli-1.21.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spa
rk2/jars/jtransforms-2.4.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jetty-util-ajax-9.3.24.v20180605.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerb-simplekdc-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-kvstore_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/metrics-core-3.1.5.jar:/usr/hdp/3.1.4.0-315/spark2/jars/metrics-json-3.1.5.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jackson-databind-2.9.9.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/error_prone_annotations-2.3.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/metrics-jvm-3.1.5.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jul-to-slf4j-1.7.16.jar:/usr/hdp/3.1.4.0-315/spark2/jars/py4j-0.10.7.jar:/usr/hdp/3.1.4.0-315/spark2/jars/JavaEWAH-0.3.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-openstack-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-sql_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jackson-core-2.9.9.jar:/usr/hdp/3.1.4.0-315/spark2/jars/arpack_combined_all-0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/shapeless_2.11-2.3.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-tags_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerby-util-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/aopalliance-repackaged-2.4.0-b34.jar:/usr/hdp/3.1.4.0-315/spark2/jars/ehcache-3.3.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/flogger-log4j-backend-0.3.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/datanucleus-core-4.1.6.jar:/usr/hdp/3.1.4.0-315/spark2/jars/joda-time-2.9.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jsp-api-2.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jersey-container-servlet-core-2.22.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerb-common-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/guava-28.0-jre.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-yarn-server-common-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/gcs-connector-1.9.10.3.1.4.0-315-shaded.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jersey-guava-2.22.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/token-provider-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/core-1.1.2.jar:/usr/hdp/3.1.4.0-315/
spark2/jars/zookeeper-3.4.6.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-aws-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/antlr-2.7.7.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-codec-1.10.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-azure-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jersey-client-2.22.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/scalap-2.11.12.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-mapreduce-client-jobclient-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/bonecp-0.8.0.RELEASE.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-hive_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/parquet-common-1.8.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/parquet-column-1.8.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/objenesis-2.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/calcite-linq4j-1.2.0-incubating.jar:/usr/hdp/3.1.4.0-315/spark2/jars/dnsjava-2.1.7.jar:/usr/hdp/3.1.4.0-315/spark2/jars/okhttp-2.7.5.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-yarn_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/arrow-memory-0.8.0.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-network-common_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/libfb303-0.9.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jetty-xml-9.3.24.v20180605.jar:/usr/hdp/3.1.4.0-315/spark2/jars/orc-mapreduce-1.4.4-nohive.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerb-admin-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-hadoop-cloud_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/woodstox-core-5.0.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/RoaringBitmap-0.5.11.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jetty-webapp-9.3.24.v20180605.jar:/usr/hdp/3.1.4.0-315/spark2/jars/google-extensions-0.3.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/opencsv-2.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerb-identity-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/calcite-avatica-1.2.0-incubating.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-httpclient-3.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons
-cli-1.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/scala-xml_2.11-1.0.5.jar:/usr/hdp/3.1.4.0-315/spark2/jars/ivy-2.4.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/json4s-jackson_2.11-3.2.11.jar:/usr/hdp/3.1.4.0-315/spark2/jars/arrow-format-0.8.0.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-mapreduce-client-core-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/azure-data-lake-store-sdk-2.3.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kryo-shaded-3.0.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jackson-dataformat-cbor-2.9.9.jar:/usr/hdp/3.1.4.0-315/spark2/jars/parquet-encoding-1.8.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/stax2-api-3.1.4.jar:/usr/hdp/3.1.4.0-315/spark2/jars/netty-3.9.9.Final.jar:/usr/hdp/3.1.4.0-315/spark2/jars/netty-all-4.1.17.Final.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jetty-util-9.3.24.v20180605.jar:/usr/hdp/3.1.4.0-315/spark2/jars/aopalliance-1.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/chill-java-0.8.4.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-unsafe_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-yarn-api-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/javax.ws.rs-api-2.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/transaction-api-1.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/slf4j-api-1.7.16.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-compress-1.4.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jcip-annotations-1.0-1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-graphx_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/guice-servlet-4.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-compiler-3.0.8.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-azure-datalake-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/json4s-ast_2.11-3.2.11.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-pool-1.5.4.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jta-1.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerby-config-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-catalyst_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerby-xdr-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jar
s/jackson-module-jaxb-annotations-2.9.9.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hive-beeline-1.21.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/accessors-smart-1.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/scala-compiler-2.11.12.jar:/usr/hdp/3.1.4.0-315/spark2/jars/guice-4.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spire_2.11-0.13.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/antlr4-runtime-4.7.jar:/usr/hdp/3.1.4.0-315/spark2/jars/zstd-jni-1.3.2-2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/j2objc-annotations-1.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/janino-3.0.8.jar:/usr/hdp/3.1.4.0-315/spark2/jars/nimbus-jose-jwt-4.41.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jaxb-api-2.2.11.jar:/usr/hdp/3.1.4.0-315/spark2/jars/scala-reflect-2.11.12.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-annotations-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/activation-1.1.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/curator-recipes-2.12.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hive-exec-1.21.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/HikariCP-java7-2.4.12.jar:/usr/hdp/3.1.4.0-315/spark2/jars/avro-ipc-1.7.7.jar:/usr/hdp/3.1.4.0-315/spark2/jars/stringtemplate-3.2.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-daemon-1.0.13.jar:/usr/hdp/3.1.4.0-315/spark2/jars/mssql-jdbc-6.2.1.jre7.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-math3-3.4.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/okio-1.6.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/compress-lzf-1.0.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/parquet-format-2.3.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/parquet-hadoop-bundle-1.6.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-lang3-3.5.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-lang-2.6.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-beanutils-1.9.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/aws-java-sdk-bundle-1.11.375.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-hdfs-client-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/slf4j-log4j12-1.7.16.jar:/usr/hdp/3.1.4.0-315/spark2/jars/log4j-1.2.17.jar:/usr/hdp/3.1.4.0-315/spark2/jars/
HikariCP-2.5.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/javax.jdo-3.2.0-m3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/parquet-hadoop-1.8.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/derby-10.12.1.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-streaming_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jersey-media-jaxb-2.22.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jpam-1.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/ST4-4.0.4.jar:/usr/hdp/3.1.4.0-315/spark2/jars/scala-library-2.11.12.jar:/usr/hdp/3.1.4.0-315/spark2/jars/oro-2.0.8.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hive-metastore-1.21.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/leveldbjni-all-1.8.jar:/usr/hdp/3.1.4.0-315/spark2/jars/geronimo-jcache_1.0_spec-1.0-alpha-1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hive-jdbc-1.21.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-mllib-local_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jackson-core-asl-1.9.13.jar:/usr/hdp/3.1.4.0-315/spark2/jars/paranamer-2.8.jar:/usr/hdp/3.1.4.0-315/spark2/jars/bcprov-jdk15on-1.60.jar:/usr/hdp/3.1.4.0-315/spark2/jars/metrics-graphite-3.1.5.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerb-server-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/javax.inject-1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-cloud-storage-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-yarn-client-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/lz4-java-1.4.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jackson-jaxrs-base-2.9.9.jar:/usr/hdp/3.1.4.0-315/spark2/jars/curator-client-2.12.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/pyrolite-4.13.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jackson-module-paranamer-2.9.9.jar:/usr/hdp/3.1.4.0-315/spark2/jars/validation-api-1.1.0.Final.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-dbcp-1.4.jar:/usr/hdp/3.1.4.0-315/spark2/jars/datanucleus-api-jdo-4.2.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/failureaccess-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerby-pkix-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jersey-common-2.22.2.jar:/usr
/hdp/3.1.4.0-315/spark2/jars/hadoop-yarn-registry-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-client-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/json-smart-2.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/xz-1.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hk2-locator-2.4.0-b34.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jackson-annotations-2.9.9.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-core_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/snappy-java-1.1.2.6.jar:/usr/hdp/3.1.4.0-315/spark2/jars/xbean-asm5-shaded-4.4.jar:/usr/hdp/3.1.4.0-315/spark2/jars/httpclient-4.5.4.jar:/usr/hdp/3.1.4.0-315/spark2/jars/scala-parser-combinators_2.11-1.1.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/httpcore-4.4.8.jar:/usr/hdp/3.1.4.0-315/spark2/jars/javax.annotation-api-1.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-mapreduce-client-common-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/parquet-jackson-1.8.3.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jodd-core-3.5.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jdo-api-3.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/bcpkix-jdk15on-1.60.jar:/usr/hdp/3.1.4.0-315/spark2/jars/jersey-server-2.22.2.jar:/usr/hdp/3.1.4.0-315/spark2/jars/hadoop-common-3.1.1.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/protobuf-java-2.5.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/azure-keyvault-core-1.0.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/htrace-core4-4.1.0-incubating.jar:/usr/hdp/3.1.4.0-315/spark2/jars/stax-api-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/kerby-asn1-1.0.1.jar:/usr/hdp/3.1.4.0-315/spark2/jars/spark-repl_2.11-2.3.2.3.1.4.0-315.jar:/usr/hdp/3.1.4.0-315/spark2/jars/json4s-core_2.11-3.2.11.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-crypto-1.0.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/minlog-1.3.0.jar:/usr/hdp/3.1.4.0-315/spark2/jars/commons-collections-3.2.2.jar:/etc/hadoop/conf/ 20/09/24 17:17:53 INFO ZooKeeper: Client environment:java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib 20/09/24 17:17:53 INFO ZooKeeper: 
Client environment:java.io.tmpdir=/tmp 20/09/24 17:17:53 INFO ZooKeeper: Client environment:java.compiler= 20/09/24 17:17:53 INFO ZooKeeper: Client environment:os.name=Linux 20/09/24 17:17:53 INFO ZooKeeper: Client environment:os.arch=amd64 20/09/24 17:17:53 INFO ZooKeeper: Client environment:os.version=3.10.0-693.el7.x86_64 20/09/24 17:17:53 INFO ZooKeeper: Client environment:user.name=dam_test 20/09/24 17:17:53 INFO ZooKeeper: Client environment:user.home=/home/dam_test 20/09/24 17:17:53 INFO ZooKeeper: Client environment:user.dir=/tmp/dolphinscheduler/exec/process/1/6/21/23 20/09/24 17:17:53 INFO ZooKeeper: Initiating client connection, connectString=gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@7ff2a664 20/09/24 17:17:53 INFO ClientCnxn: Opening socket connection to server gdlt-b-master01.cnbdcu.com/172.17.8.184:2181. Will not attempt to authenticate using SASL (unknown error) 20/09/24 17:17:53 INFO ClientCnxn: Socket connection established, initiating session, client: /172.17.9.41:46234, server: gdlt-b-master01.cnbdcu.com/172.17.8.184:2181 20/09/24 17:17:53 INFO ClientCnxn: Session establishment complete on server gdlt-b-master01.cnbdcu.com/172.17.8.184:2181, sessionid = 0x172d9195d1ede37, negotiated timeout = 60000 20/09/24 17:17:53 INFO ConnectionStateManager: State change: CONNECTED 20/09/24 17:17:53 INFO CuratorFrameworkImpl: backgroundOperationsLoop exiting 20/09/24 17:17:53 INFO ZooKeeper: Session: 0x172d9195d1ede37 closed 20/09/24 17:17:53 INFO ClientCnxn: EventThread shut down 20/09/24 17:17:53 INFO Utils: Resolved authority: gdlt-b-master05.cnbdcu.com:10000 20/09/24 17:17:54 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... 
using builtin-java classes where applicable 20/09/24 17:17:54 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master05.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public [INFO] 2020-09-24 17:17:54.274 - [taskAppId=TASK-6-21-23]:[121] - -> 20/09/24 17:17:54 ERROR TSaslTransport: SASL negotiation failure javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:95) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:38) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206) at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:178) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142) at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207) at org.apache.hive.beeline.Commands.connect(Commands.java:1149) at 
org.apache.hive.beeline.Commands.connect(Commands.java:1070) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:54) at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:971) at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:707) at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:757) at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484) at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467) Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt) at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ... 
28 more 20/09/24 17:17:54 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master05.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO CuratorFrameworkImpl: Starting 20/09/24 17:17:54 INFO ZooKeeper: Initiating client connection, connectString=gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@3738449f 20/09/24 17:17:54 INFO ClientCnxn: Opening socket connection to server gdlt-b-master02.cnbdcu.com/172.17.8.185:2181. Will not attempt to authenticate using SASL (unknown error) 20/09/24 17:17:54 INFO ClientCnxn: Socket connection established, initiating session, client: /172.17.9.41:43186, server: gdlt-b-master02.cnbdcu.com/172.17.8.185:2181 20/09/24 17:17:54 INFO ClientCnxn: Session establishment complete on server gdlt-b-master02.cnbdcu.com/172.17.8.185:2181, sessionid = 0x274444f679746db, negotiated timeout = 60000 20/09/24 17:17:54 INFO ConnectionStateManager: State change: CONNECTED 20/09/24 17:17:54 INFO CuratorFrameworkImpl: backgroundOperationsLoop exiting 20/09/24 17:17:54 INFO ZooKeeper: Session: 0x274444f679746db closed 20/09/24 17:17:54 INFO ClientCnxn: EventThread shut down 20/09/24 17:17:54 INFO Utils: Selected HiveServer2 instance with uri: jdbc:hive2://gdlt-b-master05.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO HiveConnection: Will retry opening client transport 20/09/24 17:17:54 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master05.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 ERROR TSaslTransport: SASL negotiation failure javax.security.sasl.SaslException: GSS 
initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:95) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:38) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) [INFO] 2020-09-24 17:17:54.290 - [taskAppId=TASK-6-21-23]:[121] - -> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206) at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:178) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142) at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207) at org.apache.hive.beeline.Commands.connect(Commands.java:1149) at org.apache.hive.beeline.Commands.connect(Commands.java:1070) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at 
org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:54) at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:971) at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:707) at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:757) at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484) at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467) Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt) at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ... 28 more 20/09/24 17:17:54 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master05.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO CuratorFrameworkImpl: Starting 20/09/24 17:17:54 INFO ZooKeeper: Initiating client connection, connectString=gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@a3d8174 20/09/24 17:17:54 INFO ClientCnxn: Opening socket connection to server gdlt-b-master02.cnbdcu.com/172.17.8.185:2181. 
Will not attempt to authenticate using SASL (unknown error) 20/09/24 17:17:54 INFO ClientCnxn: Socket connection established, initiating session, client: /172.17.9.41:43190, server: gdlt-b-master02.cnbdcu.com/172.17.8.185:2181 20/09/24 17:17:54 INFO ClientCnxn: Session establishment complete on server gdlt-b-master02.cnbdcu.com/172.17.8.185:2181, sessionid = 0x274444f679746dc, negotiated timeout = 60000 20/09/24 17:17:54 INFO ConnectionStateManager: State change: CONNECTED 20/09/24 17:17:54 INFO CuratorFrameworkImpl: backgroundOperationsLoop exiting 20/09/24 17:17:54 INFO ZooKeeper: Session: 0x274444f679746dc closed 20/09/24 17:17:54 INFO ClientCnxn: EventThread shut down 20/09/24 17:17:54 INFO Utils: Selected HiveServer2 instance with uri: jdbc:hive2://gdlt-b-master05.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO HiveConnection: Will retry opening client transport 20/09/24 17:17:54 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master05.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 ERROR TSaslTransport: SASL negotiation failure javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:95) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:38) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) at 
java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206) at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:178) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142) at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207) at org.apache.hive.beeline.Commands.connect(Commands.java:1149) at org.apache.hive.beeline.Commands.connect(Commands.java:1070) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) [INFO] 2020-09-24 17:17:54.309 - [taskAppId=TASK-6-21-23]:[121] - -> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hive.beeline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:54) at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:971) at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:707) at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:757) at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484) at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467) Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt) at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) at 
sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ... 28 more 20/09/24 17:17:54 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master05.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO CuratorFrameworkImpl: Starting 20/09/24 17:17:54 INFO ZooKeeper: Initiating client connection, connectString=gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@4e096385 20/09/24 17:17:54 INFO ClientCnxn: Opening socket connection to server gdlt-b-master02.cnbdcu.com/172.17.8.185:2181. 
Will not attempt to authenticate using SASL (unknown error) 20/09/24 17:17:54 INFO ClientCnxn: Socket connection established, initiating session, client: /172.17.9.41:43194, server: gdlt-b-master02.cnbdcu.com/172.17.8.185:2181 20/09/24 17:17:54 INFO ClientCnxn: Session establishment complete on server gdlt-b-master02.cnbdcu.com/172.17.8.185:2181, sessionid = 0x274444f679746dd, negotiated timeout = 60000 20/09/24 17:17:54 INFO ConnectionStateManager: State change: CONNECTED 20/09/24 17:17:54 INFO CuratorFrameworkImpl: backgroundOperationsLoop exiting 20/09/24 17:17:54 INFO ZooKeeper: Session: 0x274444f679746dd closed 20/09/24 17:17:54 INFO ClientCnxn: EventThread shut down Error: Could not open client transport for any of the Server URI's in ZooKeeper: Unable to read HiveServer2 configs from ZooKeeper (state=08S01,code=0) 20/09/24 17:17:54 INFO Utils: Supplied authorities: gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 20/09/24 17:17:54 INFO CuratorFrameworkImpl: Starting 20/09/24 17:17:54 INFO ZooKeeper: Initiating client connection, connectString=gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@a2431d0 20/09/24 17:17:54 INFO ClientCnxn: Opening socket connection to server gdlt-b-master04.cnbdcu.com/172.17.8.182:2181. 
Will not attempt to authenticate using SASL (unknown error) 20/09/24 17:17:54 INFO ClientCnxn: Socket connection established, initiating session, client: /172.17.9.41:45646, server: gdlt-b-master04.cnbdcu.com/172.17.8.182:2181 20/09/24 17:17:54 INFO ClientCnxn: Session establishment complete on server gdlt-b-master04.cnbdcu.com/172.17.8.182:2181, sessionid = 0x472d9195dbfe122, negotiated timeout = 60000 20/09/24 17:17:54 INFO ConnectionStateManager: State change: CONNECTED 20/09/24 17:17:54 INFO CuratorFrameworkImpl: backgroundOperationsLoop exiting 20/09/24 17:17:54 INFO ZooKeeper: Session: 0x472d9195dbfe122 closed 20/09/24 17:17:54 INFO ClientCnxn: EventThread shut down 20/09/24 17:17:54 INFO Utils: Resolved authority: gdlt-b-master03.cnbdcu.com:10000 20/09/24 17:17:54 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 ERROR TSaslTransport: SASL negotiation failure javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:95) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:38) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) at 
org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206) at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:178) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142) at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207) at org.apache.hive.beeline.BeeLine.assertConnection(BeeLine.java:1171) at org.apache.hive.beeline.Commands.execute(Commands.java:794) at org.apache.hive.beeline.Commands.sql(Commands.java:713) at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:973) [INFO] 2020-09-24 17:17:54.336 - [taskAppId=TASK-6-21-23]:[121] - -> at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:720) at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:757) at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484) at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467) Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt) at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ... 
24 more 20/09/24 17:17:54 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO CuratorFrameworkImpl: Starting 20/09/24 17:17:54 INFO ZooKeeper: Initiating client connection, connectString=gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@50ad3bc1 20/09/24 17:17:54 INFO ClientCnxn: Opening socket connection to server gdlt-b-master05.cnbdcu.com/172.17.8.186:2181. Will not attempt to authenticate using SASL (unknown error) 20/09/24 17:17:54 INFO ClientCnxn: Socket connection established, initiating session, client: /172.17.9.41:56772, server: gdlt-b-master05.cnbdcu.com/172.17.8.186:2181 20/09/24 17:17:54 INFO ClientCnxn: Session establishment complete on server gdlt-b-master05.cnbdcu.com/172.17.8.186:2181, sessionid = 0x572d96085e1a945, negotiated timeout = 60000 20/09/24 17:17:54 INFO ConnectionStateManager: State change: CONNECTED 20/09/24 17:17:54 INFO CuratorFrameworkImpl: backgroundOperationsLoop exiting 20/09/24 17:17:54 INFO ZooKeeper: Session: 0x572d96085e1a945 closed 20/09/24 17:17:54 INFO ClientCnxn: EventThread shut down 20/09/24 17:17:54 INFO Utils: Selected HiveServer2 instance with uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO HiveConnection: Will retry opening client transport 20/09/24 17:17:54 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 ERROR TSaslTransport: SASL negotiation failure javax.security.sasl.SaslException: GSS 
initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:95) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:38) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206) at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:178) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142) at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207) at org.apache.hive.beeline.BeeLine.assertConnection(BeeLine.java:1171) at org.apache.hive.beeline.Commands.execute(Commands.java:794) at org.apache.hive.beeline.Commands.sql(Commands.java:713) at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:973) at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:720) at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:757) at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484) at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467) 
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt) at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ... 24 more 20/09/24 17:17:54 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO CuratorFrameworkImpl: Starting [INFO] 2020-09-24 17:17:54.354 - [taskAppId=TASK-6-21-23]:[121] - -> 20/09/24 17:17:54 INFO ZooKeeper: Initiating client connection, connectString=gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@37654521 20/09/24 17:17:54 INFO ClientCnxn: Opening socket connection to server gdlt-b-master02.cnbdcu.com/172.17.8.185:2181. 
Will not attempt to authenticate using SASL (unknown error) 20/09/24 17:17:54 INFO ClientCnxn: Socket connection established, initiating session, client: /172.17.9.41:43204, server: gdlt-b-master02.cnbdcu.com/172.17.8.185:2181 20/09/24 17:17:54 INFO ClientCnxn: Session establishment complete on server gdlt-b-master02.cnbdcu.com/172.17.8.185:2181, sessionid = 0x274444f679746de, negotiated timeout = 60000 20/09/24 17:17:54 INFO ConnectionStateManager: State change: CONNECTED 20/09/24 17:17:54 INFO CuratorFrameworkImpl: backgroundOperationsLoop exiting 20/09/24 17:17:54 INFO ZooKeeper: Session: 0x274444f679746de closed 20/09/24 17:17:54 INFO ClientCnxn: EventThread shut down 20/09/24 17:17:54 INFO Utils: Selected HiveServer2 instance with uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO HiveConnection: Will retry opening client transport 20/09/24 17:17:54 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 ERROR TSaslTransport: SASL negotiation failure javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:95) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:38) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) at 
java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206) at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:178) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142) at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207) at org.apache.hive.beeline.BeeLine.assertConnection(BeeLine.java:1171) at org.apache.hive.beeline.Commands.execute(Commands.java:794) at org.apache.hive.beeline.Commands.sql(Commands.java:713) at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:973) at org.apache.hive.beeline.BeeLine.initArgs(BeeLine.java:720) at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:757) at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484) at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467) Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt) at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) at 
com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ... 24 more 20/09/24 17:17:54 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO CuratorFrameworkImpl: Starting 20/09/24 17:17:54 INFO ZooKeeper: Initiating client connection, connectString=gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@7a419da4 20/09/24 17:17:54 INFO ClientCnxn: Opening socket connection to server gdlt-b-master05.cnbdcu.com/172.17.8.186:2181. Will not attempt to authenticate using SASL (unknown error) 20/09/24 17:17:54 INFO ClientCnxn: Socket connection established, initiating session, client: /172.17.9.41:56780, server: gdlt-b-master05.cnbdcu.com/172.17.8.186:2181 20/09/24 17:17:54 INFO ClientCnxn: Session establishment complete on server gdlt-b-master05.cnbdcu.com/172.17.8.186:2181, sessionid = 0x572d96085e1a946, negotiated timeout = 60000 20/09/24 17:17:54 INFO ConnectionStateManager: State change: CONNECTED 20/09/24 17:17:54 INFO CuratorFrameworkImpl: backgroundOperationsLoop exiting 20/09/24 17:17:54 INFO ZooKeeper: Session: 0x572d96085e1a946 closed 20/09/24 17:17:54 INFO ClientCnxn: EventThread shut down No current connection 20/09/24 17:17:54 INFO Utils: Supplied authorities: gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 20/09/24 17:17:54 INFO CuratorFrameworkImpl: Starting 20/09/24 17:17:54 INFO ZooKeeper: Initiating client connection, 
connectString=gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@3d5c822d 20/09/24 17:17:54 INFO ClientCnxn: Opening socket connection to server gdlt-b-master04.cnbdcu.com/172.17.8.182:2181. Will not attempt to authenticate using SASL (unknown error) 20/09/24 17:17:54 INFO ClientCnxn: Socket connection established, initiating session, client: /172.17.9.41:45660, server: gdlt-b-master04.cnbdcu.com/172.17.8.182:2181 20/09/24 17:17:54 INFO ClientCnxn: Session establishment complete on server gdlt-b-master04.cnbdcu.com/172.17.8.182:2181, sessionid = 0x472d9195dbfe123, negotiated timeout = 60000 [INFO] 2020-09-24 17:17:54.377 - [taskAppId=TASK-6-21-23]:[121] - -> 20/09/24 17:17:54 INFO ConnectionStateManager: State change: CONNECTED 20/09/24 17:17:54 INFO CuratorFrameworkImpl: backgroundOperationsLoop exiting 20/09/24 17:17:54 INFO ZooKeeper: Session: 0x472d9195dbfe123 closed 20/09/24 17:17:54 INFO ClientCnxn: EventThread shut down 20/09/24 17:17:54 INFO Utils: Resolved authority: gdlt-b-master03.cnbdcu.com:10000 20/09/24 17:17:54 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 ERROR TSaslTransport: SASL negotiation failure javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:95) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) at 
org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:38) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206) at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:178) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142) at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207) at org.apache.hive.beeline.Commands.close(Commands.java:987) at org.apache.hive.beeline.Commands.closeall(Commands.java:969) at org.apache.hive.beeline.BeeLine.close(BeeLine.java:826) at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:773) at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484) at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467) Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt) at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) at 
sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ... 22 more 20/09/24 17:17:54 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO CuratorFrameworkImpl: Starting 20/09/24 17:17:54 INFO ZooKeeper: Initiating client connection, connectString=gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@6d4e5011 20/09/24 17:17:54 INFO ClientCnxn: Opening socket connection to server gdlt-b-master03.cnbdcu.com/172.17.8.183:2181. Will not attempt to authenticate using SASL (unknown error) 20/09/24 17:17:54 INFO ClientCnxn: Socket connection established, initiating session, client: /172.17.9.41:58646, server: gdlt-b-master03.cnbdcu.com/172.17.8.183:2181 20/09/24 17:17:54 INFO ClientCnxn: Session establishment complete on server gdlt-b-master03.cnbdcu.com/172.17.8.183:2181, sessionid = 0x372d9195d92c460, negotiated timeout = 60000 20/09/24 17:17:54 INFO ConnectionStateManager: State change: CONNECTED 20/09/24 17:17:54 INFO CuratorFrameworkImpl: backgroundOperationsLoop exiting 20/09/24 17:17:54 INFO ZooKeeper: Session: 0x372d9195d92c460 closed 20/09/24 17:17:54 INFO ClientCnxn: EventThread shut down 20/09/24 17:17:54 INFO Utils: Selected HiveServer2 instance with uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO HiveConnection: Will retry opening client transport 20/09/24 17:17:54 INFO HiveConnection: Will try to open client transport with JDBC Uri: 
jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 ERROR TSaslTransport: SASL negotiation failure javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:95) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:38) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) at java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) [INFO] 2020-09-24 17:17:54.389 - [taskAppId=TASK-6-21-23]:[121] - -> at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206) at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:178) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142) at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207) at org.apache.hive.beeline.Commands.close(Commands.java:987) at org.apache.hive.beeline.Commands.closeall(Commands.java:969) at org.apache.hive.beeline.BeeLine.close(BeeLine.java:826) at 
org.apache.hive.beeline.BeeLine.begin(BeeLine.java:773) at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484) at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467) Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt) at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ... 22 more 20/09/24 17:17:54 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO CuratorFrameworkImpl: Starting 20/09/24 17:17:54 INFO ZooKeeper: Initiating client connection, connectString=gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@156b88f5 20/09/24 17:17:54 INFO ClientCnxn: Opening socket connection to server gdlt-b-master04.cnbdcu.com/172.17.8.182:2181. 
Will not attempt to authenticate using SASL (unknown error) 20/09/24 17:17:54 INFO ClientCnxn: Socket connection established, initiating session, client: /172.17.9.41:45668, server: gdlt-b-master04.cnbdcu.com/172.17.8.182:2181 20/09/24 17:17:54 INFO ClientCnxn: Session establishment complete on server gdlt-b-master04.cnbdcu.com/172.17.8.182:2181, sessionid = 0x472d9195dbfe124, negotiated timeout = 60000 20/09/24 17:17:54 INFO ConnectionStateManager: State change: CONNECTED 20/09/24 17:17:54 INFO CuratorFrameworkImpl: backgroundOperationsLoop exiting 20/09/24 17:17:54 INFO ZooKeeper: Session: 0x472d9195dbfe124 closed 20/09/24 17:17:54 INFO ClientCnxn: EventThread shut down 20/09/24 17:17:54 INFO Utils: Selected HiveServer2 instance with uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO HiveConnection: Will retry opening client transport 20/09/24 17:17:54 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 ERROR TSaslTransport: SASL negotiation failure javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)] at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211) at org.apache.thrift.transport.TSaslClientTransport.handleSaslStartMessage(TSaslClientTransport.java:95) at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:271) at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:38) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49) at 
java.security.AccessController.doPrivileged(Native Method) at javax.security.auth.Subject.doAs(Subject.java:422) at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730) at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49) at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:206) at org.apache.hive.jdbc.HiveConnection.(HiveConnection.java:178) at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at org.apache.hive.beeline.DatabaseConnection.connect(DatabaseConnection.java:142) at org.apache.hive.beeline.DatabaseConnection.getConnection(DatabaseConnection.java:207) at org.apache.hive.beeline.Commands.close(Commands.java:987) at org.apache.hive.beeline.Commands.closeall(Commands.java:969) at org.apache.hive.beeline.BeeLine.close(BeeLine.java:826) at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:773) at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484) at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467) Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt) at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147) at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122) [INFO] 2020-09-24 17:17:54.428 - [taskAppId=TASK-6-21-23]:[121] - -> at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187) at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212) at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179) at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192) ... 
22 more 20/09/24 17:17:54 INFO HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://gdlt-b-master03.cnbdcu.com:10000/odm;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=hiveserver2?tez.queue.name=public 20/09/24 17:17:54 INFO CuratorFrameworkImpl: Starting 20/09/24 17:17:54 INFO ZooKeeper: Initiating client connection, connectString=gdlt-b-master01.cnbdcu.com:2181,gdlt-b-master02.cnbdcu.com:2181,gdlt-b-master03.cnbdcu.com:2181,gdlt-b-master04.cnbdcu.com:2181,gdlt-b-master05.cnbdcu.com:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@7fee8714 20/09/24 17:17:54 INFO ClientCnxn: Opening socket connection to server gdlt-b-master01.cnbdcu.com/172.17.8.184:2181. Will not attempt to authenticate using SASL (unknown error) 20/09/24 17:17:54 INFO ClientCnxn: Socket connection established, initiating session, client: /172.17.9.41:46276, server: gdlt-b-master01.cnbdcu.com/172.17.8.184:2181 20/09/24 17:17:54 INFO ClientCnxn: Session establishment complete on server gdlt-b-master01.cnbdcu.com/172.17.8.184:2181, sessionid = 0x172d9195d1ede38, negotiated timeout = 60000 20/09/24 17:17:54 INFO ConnectionStateManager: State change: CONNECTED 20/09/24 17:17:54 INFO CuratorFrameworkImpl: backgroundOperationsLoop exiting 20/09/24 17:17:54 INFO ZooKeeper: Session: 0x172d9195d1ede38 closed 20/09/24 17:17:54 INFO ClientCnxn: EventThread shut down Error: Could not open client transport for any of the Server URI's in ZooKeeper: Unable to read HiveServer2 configs from ZooKeeper (state=08S01,code=0)

qiaozhanwei commented 4 years ago

1. Has Kerberos been enabled for the service? 2. If it is enabled, check whether the principal is valid.

Thx
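
A quick way to verify the second point is to log in from the keytab and inspect the resulting ticket. The commands in the comments below are the usual MIT Kerberos client tools (assumed to be installed on the worker host); the executable part of this sketch only parses a captured sample of `klist` output, so the logic can be followed without a live KDC:

```shell
# Sketch of the principal/ticket check. On a real worker host you would run:
#   kinit -kt /opt/soft/dolphinscheduler/conf/dam.keytab dam@CNBDCU.COM
#   klist
# Below we parse a captured sample of `klist` output instead (hypothetical
# values), so this snippet runs anywhere.
sample_klist='Ticket cache: FILE:/tmp/krb5cc_1000
Default principal: dam@CNBDCU.COM

Valid starting       Expires              Service principal
09/24/2020 16:00:00  09/25/2020 02:00:00  krbtgt/CNBDCU.COM@CNBDCU.COM'

# Extract the principal and the TGT expiry time from the sample.
principal=$(printf '%s\n' "$sample_klist" | awk -F': ' '/Default principal/ {print $2}')
expires=$(printf '%s\n' "$sample_klist" | awk '/krbtgt/ {print $3, $4}')
echo "principal=$principal"
echo "tgt_expires=$expires"
```

If the principal is wrong or the TGT is already past its `Expires` time, the `Failed to find any Kerberos tgt` error above is expected.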

shiliquan commented 4 years ago

(screenshots) All of my Kerberos settings are shown here, and they are confirmed to be in effect. Running the same thing from the shell command line works; it only fails inside DS.

liuxuedongcn commented 3 years ago

Has the problem been solved?

CalvinKirs commented 3 years ago

Has the problem been solved?

Do you have any good suggestions?

liuxuedongcn commented 3 years ago

Has the problem been solved?

Do you have any good suggestions?

No, I have a similar problem: Kerberos authentication itself works, but creating a Hive data source with Kerberos authentication fails.

liuxuedongcn commented 3 years ago

Has the problem been solved?

Do you have any good suggestions?

No, I have a similar problem: Kerberos authentication itself works, but creating a Hive data source with Kerberos authentication fails.

The solution is to set resource.storage.type=HDFS in common.properties.
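
For reference, a minimal sketch of the Kerberos-related block in common.properties (the values shown are the ones shiliquan posts later in this thread; adjust the krb5.conf path, principal, and keytab path to your own cluster):

```properties
resource.storage.type=HDFS
hadoop.security.authentication.startup.state=true
java.security.krb5.conf.path=/opt/soft/dolphinscheduler/conf/krb5.conf
login.user.keytab.username=dam@CNBDCU.COM
login.user.keytab.path=/opt/soft/dolphinscheduler/conf/dam.keytab
```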

shiliquan commented 3 years ago

The original configuration file already has HDFS configured. Could the problem be a stray space? (screenshot)

liuxuedongcn commented 3 years ago

The original configuration file already has HDFS configured. Could the problem be a stray space? (screenshot)

Have you configured hadoop.security.authentication.startup.state=true?

shiliquan commented 3 years ago

All of my configuration is here, please take a look:

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#

resource.storage.type=HDFS

# resource store on HDFS/S3 path, resource file will store to this hadoop hdfs path, self configuration,
# please make sure the directory exists on hdfs and have read write permissions. "/dolphinscheduler" is recommended
resource.upload.path=/home/dolphinscheduler/dolphinscheduler

# user data local directory path, please make sure the directory exists and have read write permissions
data.basedir.path=/tmp/dolphinscheduler

# whether kerberos starts
hadoop.security.authentication.startup.state=true

# java.security.krb5.conf path
java.security.krb5.conf.path=/opt/soft/dolphinscheduler/conf/krb5.conf

# login user from keytab username
login.user.keytab.username=dam@CNBDCU.COM

# loginUserFromKeytab path
login.user.keytab.path=/opt/soft/dolphinscheduler/conf/dam.keytab

# resource.view.suffixs
resource.view.suffixs=txt,log,sh,conf,cfg,py,java,sql,hql,xml,properties

# if resource.storage.type=HDFS
hdfs.root.user=dolphinscheduler

# if resource.storage.type=HDFS
fs.defaultFS=hdfs://hdpv3testns1:8020

# if resource.storage.type=HDFS
fs.s3a.endpoint=

# if resource.storage.type=HDFS
fs.s3a.access.key=

# if resource.storage.type=HDFS
fs.s3a.secret.key=

# if resourcemanager HA enable, please type the HA ips; if resourcemanager is single, make this value empty
yarn.resourcemanager.ha.rm.ids=xxxx,xxxx

# if resourcemanager HA enable or not use resourcemanager, please keep the default value; if resourcemanager is single, you only need to replace ds1 to actual resourcemanager hostname.
yarn.application.status.address=http://yarnIp1:8088/ws/v1/cluster/apps/%s

# system env path
dolphinscheduler.env.path=env/dolphinscheduler_env.sh

development.state=false

kerberos.expire.time=24

shiliquan commented 3 years ago

The problem now is: restarting the worker process fixes it, but after a while it fails again and the worker has to be restarted once more.

liuxuedongcn commented 3 years ago

The problem now is: restarting the worker process fixes it, but after a while it fails again and the worker has to be restarted once more.

Sounds like the ticket has expired.

shiliquan commented 3 years ago

Right, the error is exactly "Mechanism level: Failed to find any Kerberos tgt". But I'm puzzled: I set up a cron job that re-authenticates the system ticket every hour, so it shouldn't expire, and from the shell command line on the worker host I can access HDFS directly. So I wonder whether DS has some additional setting for ticket authentication? How did you end up solving this on your side?

chengshiwen commented 3 years ago

@shiliquan The unit of kerberos.expire.time is days in DS versions <= 1.3.5. You can try setting it to 1.
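
In other words, with kerberos.expire.time=24 the worker would only re-login from the keytab every 24 days, long after the TGT has expired, which matches the "works after a worker restart, fails again later" symptom above. The hourly kinit cron job renews the OS ticket cache, not the worker's internal login. A sketch of the fix for DS <= 1.3.5:

```properties
# kerberos.expire.time is interpreted in days in DS <= 1.3.5,
# so 24 meant a keytab re-login only every 24 days; 1 re-logins daily.
kerberos.expire.time=1
```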

hkun0120 commented 1 year ago

Right, the error is exactly "Mechanism level: Failed to find any Kerberos tgt". But I'm puzzled: I set up a cron job that re-authenticates the system ticket every hour, so it shouldn't expire, and from the shell command line on the worker host I can access HDFS directly. So I wonder whether DS has some additional setting for ticket authentication? How did you end up solving this on your side?

Did you ever solve this? How was it fixed in the end?