Closed: zzbennett closed this issue 7 years ago
Shoot, I realized I had actually forgotten to set the s3.url and hadoop.conf.dir properties, so that could have something to do with it.
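For reference, these are the two properties in question in the connector config (a sketch; the values here are placeholders for my setup, not the real ones):

```properties
# Placeholder values; point these at your own bucket and Hadoop conf dir.
s3.url=s3n://my-bucket
hadoop.conf.dir=/etc/hadoop/conf
```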
Hi @zzbennett, did you get past this issue?
I did, but I manually created the file. I'm still not sure whether forgetting to set s3.url and hadoop.conf.dir had something to do with it. If not, the bug still persists; if so, more error handling would be great.
Hi @zzbennett This is a side effect of not passing "s3.url". I will add config checks to make s3.url and hadoop.conf.dir mandatory, and log an error and shut down if they are not provided.
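Something along these lines (a minimal sketch of the planned check; the class and method names are hypothetical, not StreamX's actual code):

```java
import org.apache.kafka.common.config.ConfigException;

import java.util.Map;

// Hypothetical validator sketching the fail-fast check described above.
public class S3ConfigValidator {
    public static void validate(Map<String, String> props) {
        for (String key : new String[] {"s3.url", "hadoop.conf.dir"}) {
            String value = props.get(key);
            if (value == null || value.trim().isEmpty()) {
                // Throwing ConfigException lets Kafka Connect log the problem
                // and stop the connector before any writes are attempted.
                throw new ConfigException("Missing required configuration \"" + key + "\"");
            }
        }
    }
}
```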
Closing this issue; this should not happen anymore. StreamX now uses DummyWAL by default (no WAL), and it can be configured to use RDSWAL instead.
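For anyone landing here later, switching WALs should just be a config change (a sketch; `wal.class` and the class path below are hypothetical names, so check the StreamX docs for the actual key and value):

```properties
# Hypothetical key and class path; consult the StreamX README for the real ones.
wal.class=com.qubole.streamx.s3.wal.RDSWAL
```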
I was following the quickstart and when I tried starting the connector for the first time, I got this exception:
I found this statement in the HDFS connector doc file:
> Note: You need to make sure the connector user has write access to the directories specified in `topics.dir` and `logs.dir`. The default value of `topics.dir` is `/topics` and the default value of `logs.dir` is `/logs`. If you don't specify the two configurations, make sure that the connector user has write access to `/topics` and `/logs`. You may need to create `/topics` and `/logs` before running the connector, as the connector usually doesn't have write access to `/`.

I did this; however, the error still persists. I then created the file manually and it seems to have resolved the problem. Possibly a bug? I'm using the NativeS3FileSystem, but other than that I just followed the quick start.
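In case it helps anyone else, this is roughly how I created those directories up front (assuming an `s3n://` URI since I'm on NativeS3FileSystem; the bucket name is a placeholder):

```sh
# Create the default topics/logs dirs before starting the connector.
hadoop fs -mkdir -p s3n://my-bucket/topics s3n://my-bucket/logs
```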