Open aiwenmo opened 2 years ago
Q: The TINYINT mapping type is BOOLEAN. A: You can set tinyInt1isBit=false. Related CDC PR: https://github.com/ververica/flink-cdc-connectors/pull/2030/files
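For MySQL CDC sources, the tinyInt1isBit=false driver flag can be passed through the connector's jdbc.properties.* options (the mechanism introduced in the linked PR); a minimal sketch, where all table and connection values are hypothetical placeholders:

```sql
-- Hypothetical MySQL CDC source table; host/database/table names are placeholders.
-- Passing tinyInt1isBit=false to the JDBC driver makes MySQL TINYINT(1) map to
-- TINYINT instead of BOOLEAN (exact behavior depends on the connector version).
CREATE TABLE demo_source (
  id INT,
  flag TINYINT,
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'mysql-cdc',
  'hostname' = 'localhost',
  'port' = '3306',
  'username' = 'root',
  'password' = '***',
  'database-name' = 'demo_db',
  'table-name' = 'demo_table',
  'jdbc.properties.tinyInt1isBit' = 'false'
);
```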
When submitting to Flink on YARN, the following exception is thrown: Caused by: java.lang.ClassCastException: org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetFileInfoRequestProto cannot be cast to com.google.protobuf.Message
When I create a task, I cannot open it. Double-clicking the task or right-clicking to open it does not respond.
Dinky throws an exception on startup after adding phoenix-5.0.0-cdh6.2.0-client.jar: a jar/package conflict.
Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'requestMappingHandlerAdapter' defined in class path resource [org/springframework/boot/autoconfigure/web/servlet/WebMvcAutoConfiguration$EnableWebMvcConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter]: Factory method 'requestMappingHandlerAdapter' threw exception; nested exception is java.lang.NoSuchMethodError: com.jayway.jsonpath.spi.json.JacksonJsonProvider.
Following the Dinky official docs (next version) and the "CDCSOURCE whole-database sync" example for syncing to MySQL: after the full (snapshot) sync completes, rows newly inserted into the database are not synchronized incrementally; neither the Flink job nor the target database shows any new changes. The execution mode is yarn-session.
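For reference, a Dinky CDCSOURCE whole-database sync statement roughly follows the shape below (all connection values are placeholders; check the docs of your Dinky version for exact option names). One frequently reported cause of "full sync only, no incremental data" is that checkpointing is not enabled or too infrequent, since CDC pipelines typically commit on checkpoints:

```sql
-- Hypothetical CDCSOURCE sketch; option names and values are placeholders
-- based on the Dinky docs, not a verified configuration.
EXECUTE CDCSOURCE demo_sync WITH (
  'connector' = 'mysql-cdc',
  'hostname' = 'localhost',
  'port' = '3306',
  'username' = 'root',
  'password' = '***',
  'checkpoint' = '10000',           -- checkpoint interval (ms); needed for incremental commits
  'scan.startup.mode' = 'initial',  -- full snapshot first, then binlog
  'database-name' = 'demo_db',
  'sink.connector' = 'jdbc',
  'sink.url' = 'jdbc:mysql://localhost:3306/target_db',
  'sink.username' = 'root',
  'sink.password' = '***'
);
```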
FlinkJar: after uploading, the jar package cannot be selected.
Hello @, this issue has not been active for more than 30 days. It will be closed in 7 days if there is no response. If you have any questions, you can reply in the comments.
Re: the CDCSOURCE whole-database sync issue above (full sync completes but incremental data is not synchronized in yarn-session mode): has this problem been solved? I ran into a similar issue.
Dinky 1.0: after a successful local build, the code still reports that the FlinkUtil class is missing?
Please read the official documentation on local debugging carefully. Don't take parts out of context; go through it word by word and check the related screenshots.
Re: "Dinky 1.0: after a successful local build, the code still reports that the FlinkUtil class is missing?" Check whether the pom.xml of dinky-flink is missing any dependency, then run clean and recompile.
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-core</artifactId>
    <version>${flink.version}</version>
</dependency>
Please select the correct Maven profile.
Caused by: org.apache.flink.table.api.ValidationException: Field names must be unique. Found duplicates: [w1$o0, w0$o0]
When writing a Flink SQL job, the above error occurs on submission. The job uses WITH ... AS to define multiple subqueries (t1, t2, t3, t4), each using some window (OVER) functions. Testing shows the error appears when the last subquery uses a window function, yet those field names don't actually exist, and checking the field aliases shows no duplicates either. What is the cause of this error? Splitting the last subquery out into a separate job lets the task submit normally.
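A hedged sketch of the reported pattern (all table and column names below are hypothetical): the names w0$o0 / w1$o0 in the error are internal field names Flink generates for OVER aggregates, not user-declared columns, which is why no duplicate aliases are visible in the query itself:

```sql
-- Hypothetical tables/columns. Each subquery applies an OVER window;
-- Flink names the results of these aggregates internally (w0$o0, w1$o0, ...).
WITH t1 AS (
  SELECT id, ts,
         ROW_NUMBER() OVER (PARTITION BY id ORDER BY ts) AS rn
  FROM src
),
t2 AS (
  SELECT id, ts,
         SUM(amount) OVER (PARTITION BY id ORDER BY ts) AS running_total
  FROM src
)
-- Reported failing shape: another OVER window in the last subquery of the
-- chain triggered "Field names must be unique. Found duplicates: [w1$o0, w0$o0]".
-- Workaround from the report: move that last step into a separate job,
-- writing the intermediate results (t1/t2) to a sink table first.
SELECT t1.id, t2.running_total
FROM t1 JOIN t2 ON t1.id = t2.id AND t1.rn = 1;
```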
The following list is the latest FAQ (kept up to date).
If you encounter problems while using the open-source project, you may find relevant problems and solutions in the issue list below.
If your problem cannot be solved or is not listed, you can create a Bug-type issue. If you want to share an FAQ, you can create a FAQ-type issue and link it in the comment list.
Deploy
Platform
Extends
Flink
Share