Problem description: After creating an HDFS-backed native table, queries work normally at first, but one day later they start failing with the error "starlet err Open hdfs file xxx.dat". Checking on the machine shows the file does exist.
Error log:
Storage volume creation statement:

CREATE STORAGE VOLUME def_volume
TYPE = HDFS
LOCATIONS = ("xxx.db")
PROPERTIES (
    "enable" = "true",
    "hadoop.security.authentication" = "kerberos",
    "hadoop.security.kerberos.ticket.cache.path" = "/tmp/krb5cc_999"
);
My current troubleshooting hypothesis is HDFS (Kerberos) authentication: the failure always occurs after exactly one day, which matches the validity period of the Kerberos ticket cache (one day). Could anyone help take a look?
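One way to test this hypothesis: since the storage volume points at a ticket cache (/tmp/krb5cc_999) rather than a keytab, the cache can simply expire and never be renewed. A minimal sketch below, first checking the cache's expiry, then keeping it fresh via a scheduled kinit. The keytab path and principal are placeholders I made up for illustration, not values from this setup — substitute your own.

```shell
# 1) Confirm the expiry theory: inspect the ticket cache the storage
#    volume references (path taken from the CREATE STORAGE VOLUME above).
#    The "Expires" column shows whether the TGT has lapsed; a default
#    ticket_lifetime of 24h would match the one-day failure pattern.
klist -c /tmp/krb5cc_999

# 2) Keep the cache fresh: re-initialize it from a keytab on a schedule
#    shorter than the ticket lifetime, on every BE/CN node, as the OS
#    user that runs StarRocks. Keytab path and principal below are
#    HYPOTHETICAL placeholders.
# Example crontab entry (runs every 8 hours):
# 0 */8 * * * kinit -kt /etc/security/keytabs/starrocks.keytab starrocks@EXAMPLE.COM -c /tmp/krb5cc_999
```

If the klist output shows the ticket expired roughly when the queries started failing, that would confirm the authentication theory; a keytab-based renewal (or a longer/renewable ticket lifetime in krb5.conf) would then be the usual remedy.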