While accidentally writing too many fields to a new shapefile, I got a cryptic error message raised by struct.pack:
error: ushort format requires 0 <= number <= USHRT_MAX
Here are the relevant bits of debugging through ipython, using pyshp version 1.2.12:
C:\ProgramData\Anaconda2\lib\site-packages\shapefile.pyc in save(self, target, shp, shx, dbf)
1146 self.saveShx(target)
1147 self.shx.close()
-> 1148 self.saveDbf(target)
1149 self.dbf.close()
1150 if generated:
C:\ProgramData\Anaconda2\lib\site-packages\shapefile.pyc in saveDbf(self, target)
1117 target = os.path.splitext(target)[0] + '.dbf'
1118 self.dbf = self.__getFileObj(target)
-> 1119 self.__dbfHeader()
1120 self.__dbfRecords()
1121
C:\ProgramData\Anaconda2\lib\site-packages\shapefile.pyc in __dbfHeader(self)
803 recordLength = sum([int(field[2]) for field in self.fields]) + 1
804 header = pack('<BBBBLHH20x', version, year, month, day, numRecs,
--> 805 headerLength, recordLength)
806 f.write(header)
807 # Field descriptors
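For context, the error comes straight from struct's `H` (unsigned 16-bit) format code in that `'<BBBBLHH20x'` string, and can be reproduced in isolation:

```python
import struct

# '<H' packs an unsigned 16-bit integer; any value above 65535
# (USHRT_MAX) raises the same struct.error seen in the traceback.
try:
    struct.pack('<H', 0x10000)  # one past USHRT_MAX
except struct.error as exc:
    print(exc)
```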
I did something equivalent to:
for n in range(1000):
    w.field("field{0}".format(n), "C")
(I can't recall exactly how many, except that it was certainly beyond USHRT_MAX)
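If I read __dbfHeader correctly, the value that overflows is recordLength, the sum of all field widths plus one deletion-flag byte, rather than the raw field count. A back-of-the-envelope sketch (assuming pyshp's default "C" field width of 50 and a hypothetical field count):

```python
# recordLength = sum of all field widths + 1 deletion-flag byte,
# per the "sum([int(field[2]) ...]) + 1" line in the traceback above.
USHRT_MAX = 0xFFFF
field_width = 50     # pyshp's default size for a "C" field (assumed)
num_fields = 1500    # hypothetical count in the ballpark of my loop
record_length = num_fields * field_width + 1
print(record_length, record_length > USHRT_MAX)  # 75001 True
```

So at the default width, somewhere past 1300 character fields is already enough to push recordLength beyond what `'<H'` can hold.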
Could we have a better exception raised somewhere? Something like ValueError("number of fields ({0}) exceeds the maximum 256".format(headerLength)), or whatever the correct DBF limit is. Or possibly, raise an exception when w.field(name, ...) has been called too many times.
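For illustration, a guard along these lines (name and placement hypothetical, not actual pyshp code) would surface the problem with a readable message before struct.pack ever sees the value:

```python
USHRT_MAX = 0xFFFF  # limit of struct's '<H' (unsigned short) format


def check_dbf_header(num_fields, record_length):
    """Sketch of a validation step before packing the DBF header."""
    if record_length > USHRT_MAX:
        raise ValueError(
            "total record length ({0}) exceeds the DBF maximum of {1}; "
            "reduce the number or width of fields"
            .format(record_length, USHRT_MAX))
```

Called from __dbfHeader (or incrementally from w.field), this would turn the cryptic struct.error into an actionable ValueError.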