Rock-Lee-520 opened this issue 2 years ago
@Rock-liyi Could you please share the full stack trace? Not sure if you have a similar issue described in this thread? - https://stackoverflow.com/questions/49613451/unable-to-connect-to-hdfs-data-node-from-remote-client
Thanks for your help, this is the full stack trace.
```
java.lang.UnsupportedOperationException: readDirect unsupported in RemoteBlockReader
at org.apache.hadoop.hdfs.RemoteBlockReader.read(RemoteBlockReader.java:492)
at org.apache.hadoop.hdfs.DFSInputStream$ByteBufferStrategy.doRead(DFSInputStream.java:789)
at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:823)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:883)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:938)
at org.apache.hadoop.fs.FSDataInputStream.read(FSDataInputStream.java:143)
at org.apache.parquet.hadoop.util.H2SeekableInputStream$H2Reader.read(H2SeekableInputStream.java:81)
at org.apache.parquet.hadoop.util.H2SeekableInputStream.readFully(H2SeekableInputStream.java:90)
at org.apache.parquet.hadoop.util.H2SeekableInputStream.readFully(H2SeekableInputStream.java:75)
at org.apache.parquet.hadoop.ParquetFileReader.readFooter(ParquetFileReader.java:575)
at org.apache.parquet.hadoop.ParquetFileReader.
```
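For reference, the failing call is the ByteBuffer read that Parquet's H2SeekableInputStream issues through FSDataInputStream. A minimal client sketch like the one below exercises the same path outside Presto; the namenode URI and file path are placeholders, not values from this setup:

```java
// Minimal sketch: exercise the ByteBuffer read path from the stack trace
// outside Presto. URI and path are placeholders.
import java.nio.ByteBuffer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReadDirectCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://master:9000"); // placeholder namenode URI

        try (FileSystem fs = FileSystem.get(conf);
             FSDataInputStream in = fs.open(new Path("/tmp/sample.parquet"))) { // placeholder file
            // Parquet reads through this ByteBuffer overload; the legacy
            // RemoteBlockReader throws UnsupportedOperationException here.
            ByteBuffer buffer = ByteBuffer.allocate(4096);
            int n = in.read(buffer);
            System.out.println("ByteBuffer read returned " + n + " bytes");
        }
    }
}
```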
Hi @Rock-liyi, as per the stack trace it looks like there is some issue reading data from the data nodes. Did you try the property given in that link?
```xml
<property>
  <name>dfs.client.use.datanode.hostname</name>
  <value>true</value>
  <description>Whether clients should use datanode hostnames when
    connecting to datanodes.
  </description>
</property>
```
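If it is unclear whether the client is actually picking the property up, a small sketch like this sets it programmatically and attempts a plain read from the host Presto runs on; the namenode URI and file path are placeholders:

```java
// Quick client-side connectivity check (sketch): set
// dfs.client.use.datanode.hostname programmatically, mirroring the
// hdfs-site.xml property above, then try a read. URI and path are placeholders.
import java.io.InputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class DatanodeHostnameCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://master:9000");            // placeholder
        conf.setBoolean("dfs.client.use.datanode.hostname", true); // same as the XML property

        try (FileSystem fs = FileSystem.get(conf);
             InputStream in = fs.open(new Path("/tmp/sample.parquet"))) { // placeholder
            byte[] buf = new byte[1024];
            System.out.println("plain read returned " + in.read(buf) + " bytes");
        }
    }
}
```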
I tried the property from that link, but it did not work.
```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>master:50090</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/mnt/hadoop/tmp/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/mnt/hadoop/tmp/data</value>
  </property>
  <property>
    <name>dfs.client.use.datanode.hostname</name>
    <value>true</value>
    <description>Whether clients should use datanode hostnames when
      connecting to datanodes.
    </description>
  </property>
  <property>
    <name>dfs.client.use.legacy.blockreader</name>
    <value>false</value>
  </property>
</configuration>
```
@agrawalreetika Could you help me please?
The error occurs when querying a big data file.
Are you using the presto-delta connector? And which version of Presto are you on?
@agrawalreetika
There may be a problem with hadoop-apache2-2.7.4-9.jar. You can try removing core-site.xml to work around the problem, since the default is false.
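To confirm what the client classpath actually resolves for that switch (the RemoteBlockReader in the stack trace above is the legacy implementation, which does not support readDirect), a sketch like this prints the effective value after any *-site.xml files on the classpath are loaded:

```java
// Sketch: print the effective legacy block-reader setting. new Configuration()
// loads any core-site.xml / hdfs-site.xml found on the classpath, so a stray
// file flipping this to true would explain the RemoteBlockReader in the trace.
import org.apache.hadoop.conf.Configuration;

public class BlockReaderSettingCheck {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        System.out.println("dfs.client.use.legacy.blockreader = "
                + conf.getBoolean("dfs.client.use.legacy.blockreader", false));
    }
}
```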
Has anyone managed to come up with a workaround? Can the delta connector be configured to read some external Hadoop configuration?
@daniellj posted the solution that worked for him for the above issue here - https://github.com/prestodb/presto/issues/21283
Hi everyone, I got an error when using Presto to query Delta Lake data; it gives me the error shown above.
I changed the configuration in presto/etc/catalog/hdfs.xml, but it does not work.
So, can anyone help me?