Open njalan opened 1 year ago
@njalan Could you please share some environment details, such as OS, JDK version, etc.? Asking because the stacktrace says `Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make field transient java.lang.Object[] java.util.ArrayList.elementData accessible: module java.base does not "opens java.util" to unnamed module @6a56a99a`. I believe this is due to reflective-access changes in newer Java versions. See https://github.com/apache/hudi/pull/6657
If you have steps to reproduce, that would be great. The support issue template is a good way to mention details about the issue.
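As a side note, a common workaround for this kind of `InaccessibleObjectException` (an assumption on my part, not an official fix for this issue) is to explicitly open `java.util` to the unnamed module via a JVM flag. For Trino that would mean adding the flag to `etc/jvm.config` (path assumes a default Trino layout) on the coordinator and workers and restarting:

```
# etc/jvm.config -- open java.util for reflective access by unnamed modules
--add-opens=java.base/java.util=ALL-UNNAMED
```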
To Reproduce
Steps to reproduce the behavior:
1. Create the Hudi writer with the below parameters:
.option("hoodie.metadata.enable", "true")
.option("hoodie.metadata.index.column.stats.column.list", "parameterid")
.option("hoodie.metadata.index.bloom.filter.column.list", "parameterid")
.option("hoodie.metadata.index.bloom.filter.enable", "true")
.option("hoodie.metadata.index.column.stats.enable", "true")
.option("hoodie.enable.data.skipping", "true")
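For context, a full write with those options might look like the sketch below (`df`, `tableName`, and `basePath` are hypothetical placeholders, not from the original report):

```scala
// Minimal sketch of a Spark datasource write with the reported metadata-index options.
// df is an existing DataFrame; tableName and basePath are placeholder values.
df.write.format("hudi")
  .option("hoodie.table.name", tableName)                                  // assumed table name
  .option("hoodie.metadata.enable", "true")
  .option("hoodie.metadata.index.column.stats.column.list", "parameterid")
  .option("hoodie.metadata.index.bloom.filter.column.list", "parameterid")
  .option("hoodie.metadata.index.bloom.filter.enable", "true")
  .option("hoodie.metadata.index.column.stats.enable", "true")
  .option("hoodie.enable.data.skipping", "true")
  .mode("append")
  .save(basePath)
```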
Environment Description
Hudi version :0.9.0
JDK version: Zulu17.36+13-CA (build 17.0.4+8-LTS)
Spark version : 3.0.1
Hive version : 3.1.2
Presto version : 350 (it is working fine with Presto 350)
Trino version: 394 (not working with Trino 394)
Hadoop version : 3.2.2
Storage (HDFS/S3/GCS..) : s3
Running on Docker? (yes/no) :no
Hey Team, Any update on this?
I got the below error message when querying Hudi (0.11) with Trino (394) using DBeaver:
io.trino.spi.TrinoException: Error fetching partition paths from metadata table
at io.trino.plugin.hive.BackgroundHiveSplitLoader$HiveSplitLoaderTask.process(BackgroundHiveSplitLoader.java:281)
at io.trino.plugin.hive.util.ResumableTasks$1.run(ResumableTasks.java:38)
at io.trino.$gen.Trino_394____20221114_071058_2.run(Unknown Source)
at io.airlift.concurrent.BoundedExecutor.drainQueue(BoundedExecutor.java:80)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
Caused by: org.apache.hudi.exception.HoodieException: Error fetching partition paths from metadata table
at org.apache.hudi.common.fs.FSUtils.getAllPartitionPaths(FSUtils.java:315)
at org.apache.hudi.BaseHoodieTableFileIndex.getAllQueryPartitionPaths(BaseHoodieTableFileIndex.java:182)
at org.apache.hudi.BaseHoodieTableFileIndex.loadPartitionPathFiles(BaseHoodieTableFileIndex.java:225)
at org.apache.hudi.BaseHoodieTableFileIndex.doRefresh(BaseHoodieTableFileIndex.java:270)
at org.apache.hudi.BaseHoodieTableFileIndex.&lt;init&gt;(BaseHoodieTableFileIndex.java:140)
at org.apache.hudi.hadoop.HiveHoodieTableFileIndex.&lt;init&gt;(HiveHoodieTableFileIndex.java:49)
at org.apache.hudi.hadoop.HoodieCopyOnWriteTableInputFormat.listStatusForSnapshotMode(HoodieCopyOnWriteTableInputFormat.java:239)
at org.apache.hudi.hadoop.HoodieCopyOnWriteTableInputFormat.listStatus(HoodieCopyOnWriteTableInputFormat.java:146)
at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:325)
at org.apache.hudi.hadoop.HoodieParquetInputFormatBase.getSplits(HoodieParquetInputFormatBase.java:68)
at io.trino.plugin.hive.BackgroundHiveSplitLoader.lambda$loadPartition$2(BackgroundHiveSplitLoader.java:489)
at io.trino.hdfs.authentication.NoHdfsAuthentication.doAs(NoHdfsAuthentication.java:25)
at io.trino.hdfs.HdfsEnvironment.doAs(HdfsEnvironment.java:94)
at io.trino.plugin.hive.BackgroundHiveSplitLoader.loadPartition(BackgroundHiveSplitLoader.java:489)
at io.trino.plugin.hive.BackgroundHiveSplitLoader.loadSplits(BackgroundHiveSplitLoader.java:350)
at io.trino.plugin.hive.BackgroundHiveSplitLoader$HiveSplitLoaderTask.process(BackgroundHiveSplitLoader.java:274)
... 6 more
Caused by: org.apache.hudi.exception.HoodieMetadataException: Failed to retrieve list of partition from metadata
at org.apache.hudi.metadata.BaseTableMetadata.getAllPartitionPaths(BaseTableMetadata.java:113)
at org.apache.hudi.common.fs.FSUtils.getAllPartitionPaths(FSUtils.java:313)
... 21 more
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make field transient java.lang.Object[] java.util.ArrayList.elementData accessible: module java.base does not "opens java.util" to unnamed module @6a56a99a
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:354)
at java.base/java.lang.reflect.AccessibleObject.checkCanSetAccessible(AccessibleObject.java:297)
at java.base/java.lang.reflect.Field.checkCanSetAccessible(Field.java:178)
at java.base/java.lang.reflect.Field.setAccessible(Field.java:172)
at org.apache.hudi.common.util.ObjectSizeCalculator$ClassSizeInfo.&lt;init&gt;(ObjectSizeCalculator.java:246)
at org.apache.hudi.common.util.ObjectSizeCalculator.getClassSizeInfo(ObjectSizeCalculator.java:140)
at org.apache.hudi.common.util.ObjectSizeCalculator.access$400(ObjectSizeCalculator.java:56)
at org.apache.hudi.common.util.ObjectSizeCalculator$ClassSizeInfo.&lt;init&gt;(ObjectSizeCalculator.java:253)
at org.apache.hudi.common.util.ObjectSizeCalculator.getClassSizeInfo(ObjectSizeCalculator.java:140)
at org.apache.hudi.common.util.ObjectSizeCalculator.visit(ObjectSizeCalculator.java:158)
at org.apache.hudi.common.util.ObjectSizeCalculator.calculateObjectSize(ObjectSizeCalculator.java:124)
at org.apache.hudi.common.util.ObjectSizeCalculator.getObjectSize(ObjectSizeCalculator.java:74)
at org.apache.hudi.common.util.HoodieRecordSizeEstimator.&lt;init&gt;(HoodieRecordSizeEstimator.java:43)
at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.&lt;init&gt;(HoodieMergedLogRecordScanner.java:95)
at org.apache.hudi.metadata.HoodieMetadataMergedLogRecordReader.&lt;init&gt;(HoodieMetadataMergedLogRecordReader.java:63)
at org.apache.hudi.metadata.HoodieMetadataMergedLogRecordReader.&lt;init&gt;(HoodieMetadataMergedLogRecordReader.java:51)
at org.apache.hudi.metadata.HoodieMetadataMergedLogRecordReader$Builder.build(HoodieMetadataMergedLogRecordReader.java:230)
at org.apache.hudi.metadata.HoodieBackedTableMetadata.getLogRecordScanner(HoodieBackedTableMetadata.java:508)
at org.apache.hudi.metadata.HoodieBackedTableMetadata.getLogRecordScanner(HoodieBackedTableMetadata.java:470)
at org.apache.hudi.metadata.HoodieBackedTableMetadata.openReaders(HoodieBackedTableMetadata.java:416)
at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getOrCreateReaders$11(HoodieBackedTableMetadata.java:402)
at java.base/java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1708)
at org.apache.hudi.metadata.HoodieBackedTableMetadata.getOrCreateReaders(HoodieBackedTableMetadata.java:402)
at org.apache.hudi.metadata.HoodieBackedTableMetadata.lambda$getRecordsByKeys$1(HoodieBackedTableMetadata.java:211)
at java.base/java.util.HashMap.forEach(HashMap.java:1421)
at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordsByKeys(HoodieBackedTableMetadata.java:209)
at org.apache.hudi.metadata.HoodieBackedTableMetadata.getRecordByKey(HoodieBackedTableMetadata.java:141)
at org.apache.hudi.metadata.BaseTableMetadata.fetchAllPartitionPaths(BaseTableMetadata.java:281)
at org.apache.hudi.metadata.BaseTableMetadata.getAllPartitionPaths(BaseTableMetadata.java:111)
... 22 more