Closed msbentley closed 1 year ago
The error is somewhat misleading. pdr first attempts to read this as a delimiter-separated file (ignoring the specified field widths, etc.). That fails because the file contains lines with two different field counts: the "SENSOR" lines have 102 delimited fields, while the "MODE DATA" lines have 103. After failing, pdr attempts to parse the table using the specified field widths -- ASCII files intended to be read that way (typically because they have unusual padding or similar) specify START_BYTE. This fallback is the last thing pdr tries before giving up on the table, so its exception sits at the head of the stack trace, and that is what gets printed in the UserWarning.
It wouldn't actually work even if START_BYTE were present, though, because of the inconsistency between the lines (i.e., the "MODE DATA" lines don't match the field spec in the label). We would generally handle this kind of thing by investigating the data set, checking whether the variable line lengths are a "consistent inconsistency" across the product type, and coding a special case. Do you happen to know if this is an intended/pervasive feature of these data products?
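For anyone hitting a similar warning: the inconsistency described above is easy to confirm yourself before filing a report. The sketch below is not part of pdr -- it's just a minimal, self-contained way to count delimited fields per line, using a synthetic stand-in for the real product (two row types with 102 and 103 fields, as described above).

```python
import csv
import io

def field_counts(text, delimiter=","):
    """Return the set of distinct field counts across all rows
    of a delimiter-separated text block. A consistent file
    yields a single count; this product yields two."""
    return {len(row) for row in csv.reader(io.StringIO(text), delimiter=delimiter)}

# Synthetic stand-in for the real CSV: the "SENSOR" rows carry
# 102 delimited fields, the "MODE DATA" rows carry 103.
sensor_row = ",".join(["SENSOR"] + ["0"] * 101)     # 102 fields
mode_row = ",".join(["MODE DATA"] + ["0"] * 102)    # 103 fields
sample = "\n".join([sensor_row, mode_row])

print(field_counts(sample))  # {102, 103} -> delimiter parsing fails
```

Running this against the actual .CSV product (instead of the synthetic rows) would show the same two counts, which is why the delimiter-based pass bails out.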
Aha, thanks for digging into that - I'll try to find out whether there's a good reason for the inconsistency, and see whether we can get it fixed in later versions (since this is for an ongoing mission).
Copy that! Thank you for the bug report. Closing this issue for now -- feel free to comment if you find information that is actionable for us, and I will reopen the issue.
I have recently tried to open data products using the SPREADSHEET object (i.e. delimited ASCII tables) but get the error:
As far as I can see from the standard, START_BYTE should not be required here, but I could be wrong! An example product:
http://archives.esac.esa.int/psa/ftp/MARS-EXPRESS/ASPERA-3/MEX-M-ASPERA3-2-EDR-IMA-EXT7-V1.0/DATA/IMA_EDR_L1B_2019_01/IMA_AZ0020190011529C_ACCS01.CSV
http://archives.esac.esa.int/psa/ftp/MARS-EXPRESS/ASPERA-3/MEX-M-ASPERA3-2-EDR-IMA-EXT7-V1.0/DATA/IMA_EDR_L1B_2019_01/IMA_AZ0020190011529C_ACCS01.LBL
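In case it helps anyone triage similar labels: here's a rough, hypothetical helper (not part of pdr, and only a crude text scan rather than a real PDS3 parser) that lists which COLUMN objects in a label omit START_BYTE. The demo label below is made up for illustration, not taken from the product above.

```python
import re

def columns_missing_start_byte(label_text):
    """Scan PDS3-style label text for COLUMN objects and report the
    NAMEs of any that do not declare a START_BYTE keyword.
    Hypothetical helper for triage; not a full PDS3 parser."""
    missing = []
    # Grab the body of each OBJECT = COLUMN ... END_OBJECT = COLUMN block.
    for block in re.findall(
        r"OBJECT\s*=\s*COLUMN(.*?)END_OBJECT\s*=\s*COLUMN",
        label_text, flags=re.DOTALL,
    ):
        if "START_BYTE" not in block:
            name = re.search(r"NAME\s*=\s*(\S+)", block)
            missing.append(name.group(1) if name else "<unnamed>")
    return missing

# Made-up two-column label fragment for demonstration.
demo_label = """\
OBJECT = COLUMN
  NAME = SENSOR
  DATA_TYPE = ASCII_INTEGER
  BYTES = 4
END_OBJECT = COLUMN
OBJECT = COLUMN
  NAME = MODE
  DATA_TYPE = ASCII_INTEGER
  START_BYTE = 5
  BYTES = 4
END_OBJECT = COLUMN
"""
print(columns_missing_start_byte(demo_label))  # ['SENSOR']
```

Note that for SPREADSHEET (delimited) objects START_BYTE is indeed generally not expected, so an empty result here doesn't mean the label is wrong -- it just tells you whether the fixed-width fallback discussed above could even apply.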
Thanks!