SuperDARN / pydarn

Python library for visualizing SuperDARN Data
GNU Lesser General Public License v3.0

pydarn.SuperDARNRead errors reading .map file #354

Closed Yas979 closed 1 year ago

Yas979 commented 1 year ago

BUG

Please can anyone help with the following error? I am attempting to create map plots of the convection electric field, using pydarn.SuperDARNRead to extract the map data from a .map file. This worked in earlier versions of pydarn; however, the latest version results in the following error:

```python
mapfile = '/data/sol-ionosphere/sd_map_potential/fitacf3_allrad/north/map/2011/20110101.fit.map'
SDarn_read = pydarn.SuperDARNRead(mapfile)
map_data = SDarn_read.read_map()
```

```
Error: /data/sol-ionosphere/sd_map_potential/fitacf3_allrad/north/map/2011/20110101.fit.map contains an block size 540160288 > remaining bytes 69250442 at record = 0.

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/alice-home/1/y/ys378/alice3_python_env/env/lib64/python3.9/site-packages/pydarnio/dmap/superdarn.py", line 517, in read_map
    self._read_darn_records(file_struct_list)
  File "/alice-home/1/y/ys378/alice3_python_env/env/lib64/python3.9/site-packages/pydarnio/dmap/superdarn.py", line 393, in _read_darn_records
    self._read_darn_record(format_fields, optional_list)
  File "/alice-home/1/y/ys378/alice3_python_env/env/lib64/python3.9/site-packages/pydarnio/dmap/superdarn.py", line 359, in _read_darn_record
    record = self.read_record()
  File "/alice-home/1/y/ys378/alice3_python_env/env/lib64/python3.9/site-packages/pydarnio/dmap/dmap.py", line 449, in read_record
    self.bytes_check(block_size, "block size",
  File "/alice-home/1/y/ys378/alice3_python_env/env/lib64/python3.9/site-packages/pydarnio/dmap/dmap.py", line 286, in bytes_check
    raise dmap_exceptions.MismatchByteError(self.dmap_file,
pydarnio.exceptions.dmap_exceptions.MismatchByteError: Error: /data/sol-ionosphere/sd_map_potential/fitacf3_allrad/north/map/2011/20110101.fit.map contains an block size 540160288 > remaining bytes 69250442 at record = 0.
```
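As a quick sanity check before anything else, the first few bytes of the file can be inspected directly. The sketch below assumes the usual DMAP record layout that pydarnio reads (a 4-byte encoding code followed by a 4-byte block size, both little-endian 32-bit ints); `parse_dmap_header` is a hypothetical helper, not a pydarn/pydarnio function.

```python
import struct

def parse_dmap_header(first8: bytes):
    """Unpack the first DMAP record header: an encoding code and a
    block size, both little-endian 32-bit ints (assumed layout)."""
    code, block_size = struct.unpack("<ii", first8[:8])
    return code, block_size

# Hypothetical usage: compare the claimed block size to the file size.
# with open(mapfile, "rb") as f:
#     code, block_size = parse_dmap_header(f.read(8))
# A block size far larger than the file, as in the error above,
# suggests the header bytes are not what the reader expects.
```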

Priority

Information

Python 3.9.16; Rocky Linux 9.2 (Blue Onyx); matplotlib 3.7.2

Example of the bug

See above

Attempts

What have you tried already to fix it? No

What have you tried to get around it? No

Data Location

The file is ~68 MB and can be shared over a file-sharing service.

Potential Bug Location

Please note this is not required. If you can link to some code in pyDARN where you think the bug is coming from this will help the developer fix it sooner.

Potential Solution(s)

Please note this is not required. If you have some idea of how to fix/improve on the bug then please describe here for possible solution(s).

Extra Notes

Please provide other pertinent details about this feature:

carleyjmartin commented 1 year ago

Hi @Yas979, it looks like this is an issue for the I/O package, pydarnio, over on the https://github.com/SuperDARN/pyDARNio repo (issues like this are best handled over there anyway).

Could you email/transfer the file you are trying to open to carley.martin@usask.ca? It looks like the file itself doesn't fit the format that pydarnio expects in read_map for a map file. Have you tried running dmapdump from RST to check the contents of the file? (Don't worry if not; it'll just be the first thing I do when I have the file.)

You could also try read_dmap() or read_records() instead of read_map(), but I suspect you might still get the same error. Let me know if that's the case if you give it a go!
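A minimal sketch of that suggestion: try the strict map read first and fall back to the generic record read. `read_with_fallback` is a hypothetical helper written for illustration; the `read_map()`/`read_records()` methods are the ones named above.

```python
def read_with_fallback(reader):
    """Try the strict map read first; if it raises (e.g. pydarnio's
    MismatchByteError), fall back to the generic DMAP record read.
    `reader` is expected to be a pydarn.SuperDARNRead instance."""
    try:
        return reader.read_map()
    except Exception as err:
        print(f"read_map failed ({err}); trying read_records")
        return reader.read_records()
```

If read_records() fails in the same place, that points at the file's byte layout rather than the map-specific field checks.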

Also, could you let me know which versions of pydarn and pydarnio you have installed? You can find them using pip3 list. Updating pydarn does not always update the pydarnio version, but it should be >1.1.0. We haven't really updated the lines causing this error for quite a few years, so if you can read this file using an older version of pydarn/pydarnio, could you let me know which version so we can track back and find the change?
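One way to grab just those two versions without scrolling through pip3 list (importlib.metadata is in the standard library from Python 3.8):

```python
from importlib import metadata

# Report the installed version of each package, or flag it as missing.
for pkg in ("pydarn", "pydarnio"):
    try:
        print(pkg, metadata.version(pkg))
    except metadata.PackageNotFoundError:
        print(pkg, "not installed")
```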

Thanks, Carley

I will close this issue and open a copy over on pydarnio, just in case anyone goes looking for the same error over there; you can link through to the new pydarnio issue below.