Open abhims opened 10 years ago
I face the same problem as well. The data type for decimal numbers is guessed incorrectly. It seems the way to overcome this is to use the API to specify the field types directly, but in practice that is not feasible.
In addition, the data preview's Range filter does not work well with numbers after the conversion. Before conversion, it works perfectly.
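For anyone who does want to try the per-field workaround: CKAN's `datastore_create` action accepts an explicit `fields` list, which stops the guessed types from being used. A minimal sketch of the payload (the resource id is a placeholder, and the field names are just this thread's example columns):

```python
import json

# Hypothetical resource id; replace with a real one from your CKAN instance.
payload = {
    "resource_id": "00000000-0000-0000-0000-000000000000",
    "force": True,
    # Explicit types prevent the datapusher's guess from
    # demoting decimal columns to integers.
    "fields": [
        {"id": "Latitude", "type": "numeric"},
        {"id": "Longitude", "type": "numeric"},
    ],
    "records": [{"Latitude": -27.44314448, "Longitude": 127.00122}],
}

# This payload would be POSTed to /api/3/action/datastore_create with an
# Authorization header; shown here only as serialized JSON.
body = json.dumps(payload)
```

The obvious downside, as noted above, is that doing this for every resource by hand does not scale.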
P.S. I think it's a problem with reading values from the XLS format. If you leave the data in CSV format, it works fine and the decimal points are not truncated.
After going through the push_to_datastore function, I think it might be the fault of messytables.
The push_to_datastore function from the datapusher does not seem to manipulate any data directly itself, so the suspicion falls on the third-party libraries it calls.
See https://github.com/okfn/messytables/issues/92 for a solution. It involves changing two lines of code in types.py. Thanks to the issue starter.
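To illustrate the class of bug being discussed (this is a sketch, not messytables' actual types.py code): a type guesser that coerces through `float` before trying a strict integer parse will "successfully" cast every decimal string and silently drop the fraction. Parsing strictly, and falling back to `Decimal`, keeps the value intact:

```python
from decimal import Decimal, InvalidOperation

def guess_type_buggy(value):
    """Coerce through float first -- the truncation trap.

    Every decimal string 'succeeds' as an int here and loses its fraction.
    """
    try:
        return int(float(value))  # buggy: silently drops the fraction
    except ValueError:
        pass
    try:
        return Decimal(value)
    except InvalidOperation:
        return value

def guess_type_fixed(value):
    """Try a strict int parse first, then Decimal, then fall back to str."""
    try:
        return int(value)  # strict: raises ValueError on '-27.44314448'
    except ValueError:
        pass
    try:
        return Decimal(value)
    except InvalidOperation:
        return value

# guess_type_buggy("-27.44314448") -> -27 (truncated)
# guess_type_fixed("-27.44314448") -> Decimal('-27.44314448')
```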
I have geographic data stored in .xls format. The number fields all get truncated to integers when this data is uploaded. Example data:

Latitude, Longitude
-27.44314448, 127.00122
This data gets truncated to -27 and 127. I tried changing the basic types mapping to double precision, but that does not seem to help. I'm using CKAN 2.2 on Ubuntu with PostgreSQL in the development environment (paster, Python).
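A likely reason the double-precision mapping doesn't help: the truncation happens during type guessing, before the value ever reaches Postgres, so widening the column type afterwards has nothing left to recover. A tiny sketch of that order of operations:

```python
# Why remapping the column to double precision can't help after the fact:
# the fraction is lost at guess time, upstream of the database.
original = "-27.44314448"
truncated = int(float(original))  # what the buggy guess stores: -27
widened = float(truncated)        # retyping the column later: -27.0
# The fractional digits are already gone; widened != float(original).
```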