prestodb / presto

The official home of the Presto distributed SQL query engine for big data
http://prestodb.io
Apache License 2.0

Error when connecting to hive with Kerberos authentication and hive without Kerberos authentication at the same time #21070

Open 126wang opened 12 months ago

126wang commented 12 months ago


When I use Presto to query two Hive clusters at the same time, one with Kerberos authentication and one without, select * from hive.database.tablename fails with the following error:

Query failed (#20231008_062119_00000_8mfsp) in your-presto: Failed to list directory: hdfs://longi/edw/dwd/fico_acc/acc_arap/dwd_fico/cust_type. Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "Prestotools01/10.0.142.37"; destination host is: "master01.center.longi":8020;


The full stack trace from the server log (/presto-server-0.279/data/var/log/server.log):

2023-10-08T14:50:50.469+0800    WARN    hive-hive-2     org.apache.hadoop.io.retry.RetryInvocationHandler       Exception while invoking class org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing over master01.center.name/10.0.80.50:8020. Not retrying because failovers (15) exceeded maximum allowed (15)
java.io.IOException: Failed on local exception: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.; Host Details : local host is: "Prestotools01/10.0.142.37"; destination host is: "master01.center.name":8020;
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:776)
        at org.apache.hadoop.ipc.Client.call(Client.java:1480)
        at org.apache.hadoop.ipc.Client.call(Client.java:1413)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
        at com.sun.proxy.$Proxy280.getListing(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:578)
        at sun.reflect.GeneratedMethodAccessor242.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy282.getListing(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2086)
        at org.apache.hadoop.hdfs.DistributedFileSystem$DirListingIterator.<init>(DistributedFileSystem.java:944)
        at org.apache.hadoop.hdfs.DistributedFileSystem$DirListingIterator.<init>(DistributedFileSystem.java:927)
        at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:872)
        at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:868)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.listLocatedStatus(DistributedFileSystem.java:886)
        at org.apache.hadoop.fs.FileSystem.listLocatedStatus(FileSystem.java:1696)
        at org.apache.hadoop.fs.HadoopExtendedFileSystem.listLocatedStatus(HadoopExtendedFileSystem.java:246)
        at com.facebook.presto.hive.HadoopDirectoryLister.lambda$list$0(HadoopDirectoryLister.java:45)
        at com.facebook.presto.hive.util.HiveFileIterator$FileStatusIterator.<init>(HiveFileIterator.java:122)
        at com.facebook.presto.hive.util.HiveFileIterator$FileStatusIterator.<init>(HiveFileIterator.java:110)
        at com.facebook.presto.hive.util.HiveFileIterator.getLocatedFileStatusRemoteIterator(HiveFileIterator.java:99)
        at com.facebook.presto.hive.util.HiveFileIterator.computeNext(HiveFileIterator.java:92)
        at com.facebook.presto.hive.util.HiveFileIterator.computeNext(HiveFileIterator.java:39)
        at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:141)
        at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:136)
        at java.util.Spliterators$IteratorSpliterator.tryAdvance(Spliterators.java:1811)
        at java.util.stream.StreamSpliterators$WrappingSpliterator.lambda$initPartialTraversalState$0(StreamSpliterators.java:295)
        at java.util.stream.StreamSpliterators$AbstractWrappingSpliterator.fillBuffer(StreamSpliterators.java:207)
        at java.util.stream.StreamSpliterators$AbstractWrappingSpliterator.doAdvance(StreamSpliterators.java:162)
        at java.util.stream.StreamSpliterators$WrappingSpliterator.tryAdvance(StreamSpliterators.java:301)
        at java.util.Spliterators$1Adapter.hasNext(Spliterators.java:681)
        at com.facebook.presto.hive.BackgroundHiveSplitLoader.loadSplits(BackgroundHiveSplitLoader.java:195)
        at com.facebook.presto.hive.BackgroundHiveSplitLoader.access$300(BackgroundHiveSplitLoader.java:40)
        at com.facebook.presto.hive.BackgroundHiveSplitLoader$HiveSplitLoaderTask.process(BackgroundHiveSplitLoader.java:121)
        at com.facebook.presto.hive.util.ResumableTasks.safeProcessTask(ResumableTasks.java:47)
        at com.facebook.presto.hive.util.ResumableTasks.access$000(ResumableTasks.java:20)
        at com.facebook.presto.hive.util.ResumableTasks$1.run(ResumableTasks.java:35)
        at com.facebook.airlift.concurrent.BoundedExecutor.drainQueue(BoundedExecutor.java:78)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Server asks us to fall back to SIMPLE auth, but this client is configured to only allow secure connections.
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:755)
        at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:376)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1529)
        at org.apache.hadoop.ipc.Client.call(Client.java:1452)
        ... 43 more

The following are my relevant configuration files.

* The catalog properties for the Hive cluster without Kerberos:

hive.metastore.uri=thrift://10.0.80.57:9083
hive.config.resources=/home/presto/mypresto/hadoop/core-site.xml,/home/presto/mypresto/hadoop/hdfs-site.xml
hive.metastore-cache-maximum-size=10000
hive.metastore-refresh-max-threads=100
hive.metastore.username=duhy
hive.metastore.authentication.type=NONE
hive.hdfs.authentication.type=NONE
hive.copy-on-first-write-configuration-enabled=false

* /presto-server-0.279/etc/catalog/hivecdp.properties

connector.name=hive-hadoop2

hive.metastore.uri=thrift://mc03.center.name:9083
hive.config.resources=/home/presto/mypresto/hadoop_cdp/core-site.xml,/home/presto/mypresto/hadoop_cdp/hdfs-site.xml
hive.copy-on-first-write-configuration-enabled=false
hive.metastore-cache-maximum-size=10000
hive.metastore-refresh-max-threads=100
hive.metastore.username=
hive.metastore.authentication.type=KERBEROS
hive.metastore.service.principal=hive/_HOST@CENTER.NAME
hive.metastore.client.principal=@CENTER.NAME
hive.metastore.client.keytab=/home/presto/mypresto/hadoop_cdp/***.keytab
hive.hdfs.authentication.type=KERBEROS
hive.hdfs.presto.principal=@CENTER.NAME
hive.hdfs.presto.keytab=/home/presto/mypresto/hadoop_cdp/.keytab


* After each configuration update, I make sure the configuration files are identical on every node and then restart Presto.
* Any help would be appreciated, thank you!
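One detail that stands out in the configs above: both catalogs explicitly set hive.copy-on-first-write-configuration-enabled=false. As I understand it, that property exists so that each catalog copies the Hadoop Configuration before modifying it, keeping per-catalog security settings from leaking into shared state. Re-enabling it is a guess based on the property's stated purpose, not a confirmed fix, but it seems worth trying in both catalog files:

```properties
# Sketch, not a confirmed fix: copy the Hadoop Configuration on first
# write so Kerberos settings written for one catalog cannot leak into
# the configuration seen by the other catalog.
hive.copy-on-first-write-configuration-enabled=true
```

If the error goes away with this enabled, that would point at shared Hadoop configuration state as the cause of the "Server asks us to fall back to SIMPLE auth" failure.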
imjalpreet commented 12 months ago

@126wang Can you please let us know whether you are able to query successfully when each catalog is configured on its own?
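For instance (schema and table names below are placeholders), running these one at a time, with Presto restarted between tests so only one catalog is configured, would show whether one catalog's Kerberos settings are interfering with the other:

```sql
-- Placeholder schema/table names
SHOW SCHEMAS FROM hive;      -- the cluster without Kerberos
SHOW SCHEMAS FROM hivecdp;   -- the cluster with Kerberos
SELECT * FROM hive.some_db.some_table LIMIT 10;
SELECT * FROM hivecdp.some_db.some_table LIMIT 10;
```

If each catalog works in isolation but they fail together, that narrows the problem to cross-catalog interaction rather than either catalog's configuration.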

stbogdan77 commented 9 months ago

I'm experiencing the same problem.

If I try to connect via JDBC to PostgreSQL with the config below, I get an error I don't understand; if I open the database directly with the same credentials, everything works.

I'm just trying to see what tables are inside my database.
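For reference, a minimal PostgreSQL catalog file for Presto looks like the sketch below (the host, port, database, and credentials are placeholders, not your actual values). If the credentials work outside Presto but the connector fails, the connection-url format is the usual suspect:

```properties
# etc/catalog/postgresql.properties -- placeholder values only
connector.name=postgresql
connection-url=jdbc:postgresql://db-host.example.com:5432/mydb
connection-user=myuser
connection-password=mypassword
```

Once the catalog loads, SHOW TABLES FROM postgresql.public should list the tables in the public schema.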