As found in #69, if the row size is above 1 KB and the `recordlength` option is not specified, the code fails with `java.lang.ArithmeticException: / by zero`.
For background: the `recordlength` option controls how the data is split and read inside each HDFS block (or, more precisely, inside each InputSplit, as the two are not the same) by individual mappers for processing. 1 KB seemed to give good performance (and is large enough for most of the FITS files I have been using so far); for larger values you might suffer from longer garbage collection pauses.
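For illustration, here is a minimal sketch of how one might raise the record length when rows exceed the default. The option key, the byte units, and the path are assumptions based on this discussion; check the spark-fits documentation for the exact spelling:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("fits-recordlength-example")
  .getOrCreate()

// If each table row is ~2 KB, raise the record length above the
// 1 KB default so a record always holds at least one full row.
val df = spark.read
  .format("fits")
  .option("hdu", 1)                 // HDU to read
  .option("recordlength", 4 * 1024) // assumed key and byte units
  .load("path/to/table.fits")       // placeholder path
```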
However, the current error is not very explicit.
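A minimal sketch of a more explicit check (hypothetical names, not the library's actual code): the `/ by zero` presumably comes from an integer division such as `recordLength / rowSize` evaluating to 0 when a single row is larger than the record, so failing early yields a readable message instead.

```scala
// Fail fast with a descriptive message rather than letting an
// integer division by zero surface deep inside the reader.
def validateRecordLength(rowSizeBytes: Long, recordLengthBytes: Long): Unit = {
  require(
    rowSizeBytes <= recordLengthBytes,
    s"Row size ($rowSizeBytes B) exceeds the record length " +
      s"($recordLengthBytes B). Increase the recordlength option so that " +
      "at least one full row fits in a record."
  )
}
```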