PDS: I'm thinking this could actually be combined into the "poNetcdfSRS" array, which already maps from GDAL proj names to NetCDF ones. We could add an attribute to that struct to map to the CF-1 projection mappings too.
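A rough sketch of what such a combined entry could look like (the struct, field and entry names below are illustrative only, not the actual driver definitions):

{{{
/* Illustrative sketch: one combined entry carrying the GDAL/OGR projection
 * name, the NetCDF name the driver already writes, and the CF-1
 * grid_mapping_name. */
#include "ogr_srs_api.h"   /* for the SRS_PT_* projection name constants */

typedef struct
{
    const char *pszGDALProjName;   /* OGR/WKT PROJECTION name           */
    const char *pszNetcdfProjName; /* name the driver currently writes  */
    const char *pszCFProjName;     /* CF-1 grid_mapping_name            */
} oNetcdfSRSEntry;

static const oNetcdfSRSEntry poNetcdfSRS[] = {
    { SRS_PT_TRANSVERSE_MERCATOR,         "transverse_mercator",
      "transverse_mercator" },
    { SRS_PT_LAMBERT_CONFORMAL_CONIC_2SP, "lambert_conformal_conic",
      "lambert_conformal_conic" },
    { NULL, NULL, NULL }
};
}}}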
ET: Basically the specific cases have to loop over the mapped variables; how can we apply this principle to the generic case? When a mapped attribute is not found in the generic case, we do not write that attribute (vs. the specific case, where the default value is written), so it's just an if clause (don't write the attribute vs. write the default).
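A minimal sketch of that generic-case behaviour, assuming a hypothetical parameter mapping table; OGRSpatialReference::GetProjParm() and nc_put_att_double() are the real calls involved, the rest is placeholder:

{{{
#include "ogr_spatialref.h"
#include "netcdf.h"

/* Hypothetical mapping entry: WKT PARAMETER name -> CF attribute name. */
typedef struct
{
    const char *pszWKTParamName; /* e.g. SRS_PP_SCALE_FACTOR */
    const char *pszCFAttName;    /* e.g. "scale_factor_at_central_meridian" */
} oParamMapping;

static void WriteGenericProjParams( int cdfid, int nVarId,
                                    OGRSpatialReference &oSRS,
                                    const oParamMapping *pasMap )
{
    for( int i = 0; pasMap[i].pszWKTParamName != NULL; i++ )
    {
        OGRErr eErr = OGRERR_NONE;
        double dfValue = oSRS.GetProjParm( pasMap[i].pszWKTParamName,
                                           0.0, &eErr );
        if( eErr == OGRERR_NONE )
        {
            /* Parameter found in the WKT: write the mapped CF attribute. */
            nc_put_att_double( cdfid, nVarId, pasMap[i].pszCFAttName,
                               NC_DOUBLE, 1, &dfValue );
        }
        /* else: generic case -- simply don't write the attribute,
         * rather than writing a default as the specific cases do. */
    }
}
}}}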
ET: We should definitely save the WKT (with an if-needed option) so we can get the full CRS info, as it may be lacking some info using the approach I suggested. Again, we may find that the looping needs to be reversed in the generic case, so that we don't lose anything.
Just a note on how this is done for other projection remappings in other parts of the GDAL code:
So there doesn't seem to be a recommended standard way of handling this throughout all of GDAL. It seems the NetCDF driver was trying to use a much simplified single mapping table, which none of the others use, which is why it was falling down.
mostly done, needs some error checking
review notes as requested:
I can either send a patch or commit these directly.
A small follow-up to the above: before merging the code into svn, I suggest we also remove the big comment block at the end of netcdfdataset.cpp listing all the CF-1 projection attributes.
This is really obsoleted by the new mapping tables, which explicitly list these mappings ... and in any case, adding a URL reference to the CF-1 web page is probably better practice.
On export:
Yes, it should. However, when the driver calls the appropriate Set
Regarding the above: the issue happens when reading from an erroneous HFA file (missing the scale_factor) and then writing to a new NetCDF file. So for the NetCDF driver to prevent it, I think we'd need to update our export code to use default values for each parameter that isn't found.
I.e. we'd need to update the oNetcdfSRS_PP struct to include a defval field (currently commented out), and then fill one in for each parameter.
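Roughly, that change would look something like this (the defval values and the entry list below are illustrative, for a transverse_mercator-style mapping, not the actual driver table):

{{{
#include "ogr_srs_api.h"   /* for the SRS_PP_* parameter name constants */

/* Sketch of the proposed change: un-comment the defval member and give
 * every mapping entry a default to use when the parameter is missing
 * from the source SRS. */
typedef struct {
    const char *CF_ATT;   /* CF-1 attribute name                      */
    const char *WKT_ATT;  /* corresponding WKT PARAMETER name         */
    double      defval;   /* written when the WKT parameter is absent */
} oNetcdfSRS_PP;

static const oNetcdfSRS_PP poTMParamMappings[] = {
    { "scale_factor_at_central_meridian", SRS_PP_SCALE_FACTOR,     1.0 },
    { "longitude_of_central_meridian",    SRS_PP_CENTRAL_MERIDIAN, 0.0 },
    { "false_easting",                    SRS_PP_FALSE_EASTING,    0.0 },
    { "false_northing",                   SRS_PP_FALSE_NORTHING,   0.0 },
    { NULL, NULL, 0.0 }
};
}}}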
It is probably good to be 'defensive' like this, but then again the error is being caused by the ESRI SRS conversion code.
Perhaps we should create an issue to mark this as an 'upgrade', given the need to also upgrade the Create() code etc.?
OK, I see what you mean. It would make sense to add the default value when none exists; however, it would be a pain to have to add that to each and every mapping. Can we get away with either:
1) putting these values in the generic mappings only, and looking them up there, or
2) just hard-coding it (I recall that only one value, the scale factor, has a default different from 0, right?)? It's not like this is going to change anytime soon. A sketch of option 2 follows below.
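If option 2 is acceptable, it could be as small as a single helper in the export code (a sketch only; the function name is hypothetical, while EQUAL() and SRS_PP_SCALE_FACTOR are the real CPL/OGR names):

{{{
#include "cpl_port.h"      /* EQUAL() */
#include "ogr_srs_api.h"   /* SRS_PP_SCALE_FACTOR */

/* Option 2 sketch: one hard-coded fallback instead of a defval per mapping
 * entry. Only the scale factor has a non-zero default (1.0); every other
 * parameter we currently map defaults to 0.0. */
static double NCDFGetDefaultProjParm( const char *pszWKTParamName )
{
    if( EQUAL( pszWKTParamName, SRS_PP_SCALE_FACTOR ) )
        return 1.0;
    return 0.0;
}
}}}

The export code could then pass this value as the default argument to GetProjParm() (and write it when the parameter is missing), rather than skipping the attribute.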
Re the Mercator 2SP export issue raised by email and in issue 12:
1) GeoTIFF unfortunately doesn't support Mercator 2SP (see http://www.remotesensing.org/geotiff/proj_list/mercator_2sp.html and notice GeoTIFF isn't in the 'supported by' list).
2) So for testing this projection I updated my testing script to allow selecting a different intermediate format for the tests (for the gdalwarp stage, before the transform to netCDF). I've been using HFA for this, which has some issues of its own, although Mercator 2SP seems to be fine.
This is what I get after converting via HFA:
{{{
mercator#false_easting=0
mercator#false_northing=0
mercator#grid_mapping_name=mercator
mercator#inverse_flattening=298.257223563
mercator#longitude_of_prime_meridian=0
mercator#longitude_of_projection_origin=145
mercator#semi_major_axis=6378137
mercator#spatial_ref=PROJCS["Mercator_2SP",GEOGCS["GCS_WGS_1984",DATUM["WGS_1984",SPHEROID["WGS_84",6378137,298.257223563]],PRIMEM["Greenwich",0],UNIT["Degree",0.017453292519943295]],PROJECTION["Mercator_2SP"],PARAMETER["standard_parallel_1",-37],PARAMETER["central_meridian",145],PARAMETER["false_easting",0],PARAMETER["false_northing",0],UNIT["Meter",1]]
mercator#standard_parallel=-37
}}}
This suggests the overall testing strategy (that I came up with originally) is showing its limits ... really, once we can warp directly to NetCDF, this will eliminate the issues with other formats.
I think your strategy is OK. Your fix looks good (CF-conformant).
Can you please add these and any other changes (when ready) to autotest/gdrivers/netcdf_cy.py in the -proj branch?
As discussed in Google chat. Aim for:
Send note to ET for comment when done.