liuzx8888 opened this issue 1 year ago
Try putting the flink-shaded-hadoop-uber jar into the lib directory.
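A minimal sketch of that suggestion; the exact artifact name and version below are assumptions, so pick the shaded-Hadoop jar that matches your cluster:

```bash
# Sketch: drop a pre-built shaded Hadoop uber jar into Flink's lib directory.
# The file name/version here is illustrative, not prescriptive.
cp flink-shaded-hadoop-2-uber-2.7.5-10.0.jar $FLINK_HOME/lib/
# Restart the Flink cluster / resubmit the job so the new jar is on the classpath.
```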
After adding flink-shaded-hadoop-2-2.7.5-10.0 to the Flink lib directory, a new problem appeared:

```
Log Type: jobmanager.err
Log Upload Time: Mon May 08 10:06:49 +0800 2023
Log Length: 1548

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1682257958270_0172/filecache/26/chunjun-connector-binlog.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1682257958270_0172/filecache/25/chunjun-connector-hive3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1682257958270_0172/filecache/16/chunjun-metrics-prometheus.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop-3.1.3/data/nm-local-dir/usercache/root/appcache/application_1682257958270_0172/filecache/21/log4j-slf4j-impl-2.12.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop-3.1.3/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
	at org.apache.flink.yarn.entrypoint.YarnEntrypointUtils.logYarnEnvironmentInformation(YarnEntrypointUtils.java:116)
	at org.apache.flink.yarn.entrypoint.YarnJobClusterEntrypoint.main(YarnJobClusterEntrypoint.java:83)
```
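A common alternative to shading Hadoop jars into lib is to expose the cluster's own Hadoop classpath to Flink before submitting to YARN; this is only a sketch of that approach, and the submit command is illustrative:

```bash
# Sketch: make the cluster's Hadoop jars visible to Flink on YARN.
# `hadoop classpath` is the standard Hadoop CLI subcommand; Flink's scripts read HADOOP_CLASSPATH.
export HADOOP_CLASSPATH=$(hadoop classpath)
# Then resubmit the per-job application, e.g. (arguments are illustrative only):
# $FLINK_HOME/bin/flink run -t yarn-per-job ...
```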
@getwtf Which deployment mode are you launching with?
@ll076110 per-job. @getwtf I'm hitting the same problem; did you manage to solve it?
Search before asking
What happened
Versions: Flink 1.16.1, Hadoop 3.3.0, ChunJun master
After adding the dependencies to Flink's lib directory, Flink fails to start; removing chunjun-dist lets it start normally. See the diagnostic sketch after the command below.
cp -r chunjun-dist $FLINK_HOME/lib
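Since the log above complains about multiple SLF4J bindings, a small diagnostic sketch (paths and jar layout are assumptions) can show which jars under $FLINK_HOME/lib bundle their own binding:

```bash
# Sketch: list jars in Flink's lib directory that ship an SLF4J binding.
# Adjust the path if chunjun-dist was copied elsewhere.
for jar in $FLINK_HOME/lib/*.jar; do
  if unzip -l "$jar" 2>/dev/null | grep -q "org/slf4j/impl/StaticLoggerBinder.class"; then
    echo "SLF4J binding found in: $jar"
  fi
done
```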
What you expected to happen
How to reproduce
cp -r chunjun-dist $FLINK_HOME/lib
Anything else
No response
Version
master
Are you willing to submit PR?
Code of Conduct