Open SGITLOGIN opened 1 month ago
Hi @SGITLOGIN ,
Did you make the right import statements? Are you importing the right libraries?
@lucasbak Hello. The Spark configuration is as follows; almost no other configuration has been modified.
@lucasbak Hello. I took a look, and there is indeed no org.apache.hadoop.hive.ql.plan.LoadTableDesc$LoadFileType class file in /usr/odp/current/spark3-client/jars/hive-exec-2.3.9-core.jar.
But the org.apache.hadoop.hive.ql.plan.LoadTableDesc$LoadFileType class file is present in /usr/odp/current/hive-client/lib/hive-exec-3.1.3.1.2.2.0-130.jar.
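Rather than unpacking jars by hand, this kind of check can be scripted. A minimal sketch (`jarContainsClass` is a hypothetical helper; the paths are the ones from this thread):

```scala
import java.util.jar.JarFile

// Hypothetical helper: does the given jar contain a class file for `className`?
// Inner classes keep their '$' separator, e.g. LoadTableDesc$LoadFileType.
def jarContainsClass(jarPath: String, className: String): Boolean = {
  val entryName = className.replace('.', '/') + ".class"
  val jar = new JarFile(jarPath)
  try jar.getEntry(entryName) != null
  finally jar.close()
}

// Illustrative usage with the jar discussed in this thread:
// jarContainsClass(
//   "/usr/odp/current/spark3-client/jars/hive-exec-2.3.9-core.jar",
//   "org.apache.hadoop.hive.ql.plan.LoadTableDesc$LoadFileType")
```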
Did you try replacing one with the other?
@lucasbak I tried copying /usr/odp/current/hive-client/lib/hive-exec-3.1.3.1.2.2.0-130.jar to /usr/odp/current/spark3-client/jars/, and that error went away, but various other errors are then reported, such as missing class files from the hive-metastore package, etc.
@SGITLOGIN
Did you try replacing all the hiveX2.3.9.jar files with hiveX3.1.3.1.2.2.0-130.jar?
@lucasbak Hi. I tried replacing all hiveX2.3.9.jar with hiveX3.1.3.1.2.2.0-130.jar, but then the following error is reported: java.lang.NoSuchMethodError: org.apache.hadoop.hive.common.FileUtils.mkdir. I found there is a version incompatibility (https://stackoverflow.com/questions/75880410/java-lang-nosuchmethoderror-org-apache-hadoop-hive-common-fileutils-mkdir-while).
Our customer successfully uses Spark 3 with Hive 3; they are doing a manual import of the libraries. We will check it internally.
@lucasbak Hello. "They are doing a manual import of the libraries" — what are their steps?
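One common way to run Spark 3 against a Hive 3 metastore without swapping jars inside spark3-client/jars is Spark's isolated metastore classpath: set spark.sql.hive.metastore.version and point spark.sql.hive.metastore.jars at a path containing the Hive 3 jars (the "path" mode is available since Spark 3.1). This is an assumption about what "manual import" could look like, not a confirmed description of the customer's steps; the paths below are illustrative, taken from this thread:

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: have Spark load Hive 3.1.3 metastore classes from the
// hive-client lib directory instead of its bundled Hive 2.3.9 jars.
val spark = SparkSession.builder()
  .appName("spark3-with-hive3-metastore")
  .config("spark.sql.hive.metastore.version", "3.1.3")
  .config("spark.sql.hive.metastore.jars", "path")
  .config("spark.sql.hive.metastore.jars.path",
          "file:///usr/odp/current/hive-client/lib/*")
  .enableHiveSupport()
  .getOrCreate()
```

With this setup Spark keeps its built-in Hive execution jars but talks to the metastore through an isolated classloader built from the listed jars, which avoids replacing files on disk.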
We are working internally on Spark 3.5.1 for the next ODP release. It will be based on Hive 3.1.3 as built by ODP. We will keep you up to date when it is released.
It will also be accompanied by a Scala binary version bump from 2.12 to 2.13.
@lucasbak OK. When is it expected to be released?
@SGITLOGIN
What was the exact SQL you were trying to run? We are reproducing the issue internally on ODP 1.2.2.0.
Best regards
@SGITLOGIN
Did you use spark-shell or a custom Spark application? We are trying to reproduce it internally, without success.
@lucasbak Hello. The error message is as follows: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.plan.LoadTableDesc$LoadFileType
How to solve this problem?
SQL:

```scala
val insertHuanAdTaskReportSql =
  s"""
     |insert overwrite table test.huan_ad_task_report partition(task_id='1444')
     |select data_type,
     |       media,
     |       province_name,
     |       province_code,
     |       city_name,
     |       city_code,
     |       county_name,
     |       county_code,
     |       region_level,
     |       data_date,
     |       pv,
     |       uv
     |from ad_monitor.huan_ad_task_report where task_id='1313'
     """.stripMargin

spark.sql(insertHuanAdTaskReportSql)
```
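When that insert fails with the ClassNotFoundException above, a quick diagnostic from the same spark-shell session is to look up the class's code source, which shows which jar (if any) it actually resolves from. A minimal sketch (`locateClass` is a hypothetical helper name):

```scala
// Hypothetical diagnostic: report which jar a class is loaded from, if any.
def locateClass(name: String): Option[String] =
  try {
    val cls = Class.forName(name)
    // Code source is null for JDK bootstrap classes, hence the Option wrap.
    Option(cls.getProtectionDomain.getCodeSource).map(_.getLocation.toString)
  } catch {
    case _: ClassNotFoundException => None
  }

locateClass("org.apache.hadoop.hive.ql.plan.LoadTableDesc$LoadFileType") match {
  case Some(jar) => println(s"loaded from: $jar")
  case None      => println("class not found on the driver classpath")
}
```

If it prints the "not found" branch, no jar on the driver classpath provides the class, matching the behavior reported in this thread.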