apache / dolphinscheduler

Apache DolphinScheduler is a modern data orchestration platform for agile, low-code creation of high-performance workflows.
https://dolphinscheduler.apache.org/
Apache License 2.0

[Question] An error is reported when sqoop is started through a shell script: Permission denied #3906

Closed limaoyu012 closed 3 years ago

limaoyu012 commented 4 years ago

An error is reported when sqoop is started through a shell script, please help~

Which version of DolphinScheduler: 1.3.2

ERROR:

```
[INFO] 2020-10-14 14:37:40.721 - [taskAppId=TASK-11-56-153]:[121] - -> Warning: /opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
WARNING: log4j.properties is not found. HADOOP_CONF_DIR may be incomplete.
WARNING: log4j.properties is not found. HADOOP_CONF_DIR may be incomplete.
WARNING: log4j.properties is not found. HADOOP_CONF_DIR may be incomplete.
[INFO] 2020-10-14 14:37:41.887 - [taskAppId=TASK-11-56-153]:[121] - -> SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.3.2-1.cdh6.3.2.p0.1605554/jars/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
20/10/14 14:37:41 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7-cdh6.3.2
20/10/14 14:37:41 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
20/10/14 14:37:41 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
20/10/14 14:37:41 INFO tool.CodeGenTool: Beginning code generation
Wed Oct 14 14:37:41 CST 2020 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
[INFO] 2020-10-14 14:37:43.910 - [taskAppId=TASK-11-56-153]:[121] - -> 20/10/14 14:37:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM ods_hdb_mysql_building AS t LIMIT 1
20/10/14 14:37:42 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM ods_hdb_mysql_building AS t LIMIT 1
20/10/14 14:37:42 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop
20/10/14 14:37:43 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-hiveops/compile/6776a2fba020c969f7d7501a0ee41894/ods_hdb_mysql_building.java to /tmp/dolphinscheduler/exec/process/4/11/56/153/./ods_hdb_mysql_building.java.
Error: /tmp/dolphinscheduler/exec/process/4/11/56/153/./ods_hdb_mysql_building.java (Permission denied)
[INFO] 2020-10-14 14:37:44.045 - [taskAppId=TASK-11-56-153]:[121] - -> 20/10/14 14:37:43 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hiveops/compile/6776a2fba020c969f7d7501a0ee41894/ods_hdb_mysql_building.jar
20/10/14 14:37:43 INFO mapreduce.ExportJobBase: Beginning export of ods_hdb_mysql_building
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/mapreduce/Job
    at org.apache.sqoop.mapreduce.JobBase.createJob(JobBase.java:382)
    at org.apache.sqoop.mapreduce.ExportJobBase.runExport(ExportJobBase.java:418)
    at org.apache.sqoop.manager.SqlManager.exportTable(SqlManager.java:930)
    at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:93)
    at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:112)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:146)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:182)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:233)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:242)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:251)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.mapreduce.Job
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 11 more
```
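
The log shows two separate problems: the `Permission denied` happens because Sqoop's code generator writes the generated `.java` file into the current working directory (the task's exec directory, owned by the deploy user), which the tenant user apparently cannot write to; the later `NoClassDefFoundError` usually points at the Hadoop MapReduce jars not being on Sqoop's classpath. A minimal way to confirm both on the worker might look like the sketch below; the paths and user names are taken from the log above, but the check itself is only an assumption about the setup, not a confirmed diagnosis.

```shell
# Sketch: can the tenant user write into the task's exec directory?
# (the run directory below is the one from the log and will differ per run)
sudo -u hiveops touch /tmp/dolphinscheduler/exec/process/4/11/56/153/.write_test \
  && echo "tenant can write here" \
  || echo "Permission denied - matches the CompilationManager error"

# Sketch: is the MapReduce client jar visible to Hadoop/Sqoop on this host?
echo "HADOOP_MAPRED_HOME=$HADOOP_MAPRED_HOME"
hadoop classpath | tr ':' '\n' | grep -c 'hadoop-mapreduce-client-core' || true
```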

limaoyu012 commented 4 years ago

Supplementary information:

```
drwxr-x--- 3 dolphinscheduler hive    18 Oct 14 10:11 dolphinscheduler
drwxr-x--- 3 hiveops          hiveops 21 Oct 14 11:51 sqoop-hiveops
```

```
[dolphinscheduler@bigdata-prd-cdh-dn-15 tmp]$ id dolphinscheduler
uid=1008(dolphinscheduler) gid=984(hive) groups=984(hive),1007(hiveops)
[dolphinscheduler@bigdata-prd-cdh-dn-15 tmp]$ id hiveops
uid=1007(hiveops) gid=1007(hiveops) groups=1007(hiveops),994(hadoop),984(hive)
```

Deploy user: dolphinscheduler, tenant: hiveops
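
Given the listings above (the `dolphinscheduler` directory is owned by the deploy user with group `hive` and no group write bit, while the task runs as the tenant `hiveops`, which is in group `hive`), two workarounds seem plausible. The flags exist in Sqoop 1.4.x, but the directory names below are hypothetical and this is only a sketch, not a confirmed fix:

```shell
# Option 1 (sketch): keep Sqoop's generated code out of the task working
# directory. Sqoop supports --outdir (generated .java) and --bindir
# (compiled classes/jar); the directory below is hypothetical and must be
# writable by the tenant user.
mkdir -p /tmp/sqoop-hiveops/gen
# then append to the existing sqoop export command in the shell script:
#   --outdir /tmp/sqoop-hiveops/gen --bindir /tmp/sqoop-hiveops/gen

# Option 2 (sketch): grant group write on the run directory, so the tenant
# (member of group hive) can write there. Each workflow run gets a fresh
# directory, so this is only a per-run diagnostic step.
chmod -R g+w /tmp/dolphinscheduler/exec/process/4/11/56/153
```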

xingchun-chen commented 3 years ago

Is sqoop installed on the worker assigned to the task?
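
One quick way to answer that on each worker host that can be assigned the task might be the following sketch (run as the tenant user; the user name is the one reported above):

```shell
# Sketch: verify the sqoop client is on the tenant user's PATH and print its version.
sudo -u hiveops bash -lc 'command -v sqoop && sqoop version'
```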