threedi / hhnk-threedi-tools


one-d-two-d check refactor #88

Closed wvangerwen closed 5 months ago

wvangerwen commented 10 months ago

Updated to be compatible with environment in MI 3.34.5 and 3DI toolbox 2.5.5:

d2hydro commented 8 months ago

@wvangerwen: I can technically fix the broken test (an extra filter on the numpy array), but I don't know whether that is functionally desired. Most efficient if we look at it together. With the first window you get a `nodelevel` that is entirely None:

https://github.com/threedi/hhnk-threedi-tools/blob/ae38640068a2a538811268fb139fb8fb6d0b0a40/hhnk_threedi_tools/core/result_rasters/calculate_raster.py#L156

I have a reproduction in this small file: https://github.com/threedi/hhnk-threedi-tools/blob/one_d_two_d-refactor/tests/one_d_two_d_exception.py
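A minimal sketch of such an "extra filter on the numpy array" (purely illustrative: `wlvl_raw` is a hypothetical stand-in for the raw per-node water levels of the first window, which can come back as all-None):

```python
import numpy as np

# Hypothetical stand-in for the raw water-level array of a window;
# in the failing case it contains None entries.
wlvl_raw = np.array([1.2, None, 0.8, None], dtype=object)

# Keep only nodes that actually have a water level before further processing.
mask = np.array([v is not None for v in wlvl_raw])
wlvl = wlvl_raw[mask].astype(float)
print(wlvl.tolist())  # [1.2, 0.8]
```

Whether dropping the None nodes (rather than, say, raising or filling a no-data value) is the functionally correct behaviour is exactly the open question above.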

codecov[bot] commented 8 months ago

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 41.43%. Comparing base (01af596) to head (e17fa6a).

:exclamation: Current head e17fa6a differs from pull request most recent head 026dd2f. Consider uploading reports for the commit 026dd2f to get more accurate results

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main      #88      +/-   ##
==========================================
- Coverage   42.17%   41.43%   -0.74%
==========================================
  Files          52       51       -1
  Lines        6571     6432     -139
==========================================
- Hits         2771     2665     -106
+ Misses       3800     3767      -33
```

:umbrella: View full report in Codecov by Sentry.

DanielTollenaar commented 5 months ago

The updated env results in the following pytest exceptions: (screenshot)

Once we fix the first append issue, we bump into a reindex issue: (screenshot)

DanielTollenaar commented 5 months ago

`ValueError: cannot reindex on an axis with duplicate labels` is raised by:

`all_manholes.update(intersections.set_index(node_id_col)[type_col])`

`intersections.set_index(node_id_col)[type_col]` does not have a unique index: (screenshot)

`intersections.drop_duplicates(subset=[node_id_col, type_col]).set_index(node_id_col)[type_col]` does: (screenshot)
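The failure and the proposed fix can be reproduced standalone (a sketch with toy frames; the column names and data are illustrative, not the project's real tables):

```python
import pandas as pd

# Hypothetical column names mirroring the issue.
node_id_col, type_col = "node_id", "type"

# all_manholes has a unique node-id index; intersections has duplicate node ids.
all_manholes = pd.DataFrame(
    {type_col: [None, None]},
    index=pd.Index([1, 2], name=node_id_col),
)
intersections = pd.DataFrame(
    {node_id_col: [1, 1, 2], type_col: ["a", "a", "b"]}
)

# The duplicate labels in the other frame's index make update() fail:
try:
    all_manholes.update(intersections.set_index(node_id_col)[type_col])
except ValueError as err:
    print(err)  # cannot reindex on an axis with duplicate labels

# Dropping duplicate (node_id, type) pairs first yields a unique index,
# so update() succeeds.
dedup = (
    intersections.drop_duplicates(subset=[node_id_col, type_col])
    .set_index(node_id_col)[type_col]
)
all_manholes.update(dedup)
print(all_manholes[type_col].tolist())  # ['a', 'b']
```

Note that `drop_duplicates` on both columns only helps when the duplicated rows agree on `type_col`; if one node id carries two different types, the index would still be non-unique.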

DanielTollenaar commented 5 months ago

@wvangerwen, could you have a look at `test_run_depth_at_timesteps`? It is returning an array containing None:

see `delaunay`:

def delaunay(self):
    """
    Return a (delaunay, s1) tuple.

    `delaunay` is a qhull.Delaunay object, and `s1` is an array of
    waterlevels for the corresponding delaunay vertices.
    """
    try:
        return self.cache[self.DELAUNAY]
    except KeyError:
        points_grid = np.array(self.grid_gdf.centroid.apply(lambda x: [x.x, x.y]).to_list())

        # reorder a la lizard
        points_grid, wlvl = morton.reorder(points_grid, self.wlvl_raw)
        delaunay = qhull.Delaunay(points_grid)
        self.cache[self.DELAUNAY] = delaunay, wlvl
        return delaunay, wlvl

This causes an exception in `block_calculate_delauney_wlvl`, because `nodelevel` becomes an array containing None:

    delaunay, wlvl = self.delaunay
    simplices = delaunay.find_simplex(points_mesh)

    # determine which points will use interpolation
    in_gridcell = nodeid_arr != 0
    in_triangle = simplices != -1
    in_interpol = in_gridcell & in_triangle
    points_int = points_mesh[in_interpol]

    # get the nodes and the transform for the corresponding triangles
    transform = delaunay.transform[simplices[in_interpol]]
    vertices = delaunay.vertices[simplices[in_interpol]]

    # calculate weight, see print(spatial.Delaunay.transform.__doc__) and
    # Wikipedia about barycentric coordinates
    weight = np.empty(vertices.shape)
    weight[:, :2] = np.sum(transform[:, :2] * (points_int - transform[:, 2])[:, np.newaxis], 2)
    weight[:, 2] = 1 - weight[:, 0] - weight[:, 1]

    # set weight to zero for inactive nodes
    nodelevel = wlvl[vertices]
    weight[nodelevel == NO_DATA_VALUE] = 0

    # determine the sum of weights per result cell
    weight_sum = weight.sum(axis=1)

    # further subselect points suitable for interpolation
    suitable = weight_sum > 0.5
    weight = weight[suitable] / weight_sum[suitable][:, np.newaxis]
    nodelevel = nodelevel[suitable]

    # combine weight and nodelevel into result
    in_interpol_and_suitable = in_interpol.copy()
    in_interpol_and_suitable[in_interpol] &= suitable
    level[in_interpol_and_suitable] = np.sum(weight * nodelevel, axis=1)
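The barycentric-weight step above can be exercised standalone. This is a sketch on a synthetic four-point grid (all data here are illustrative); it uses the same `transform`-based weight formula as the block quoted above:

```python
import numpy as np
from scipy.spatial import Delaunay

# Synthetic stand-in for the grid-cell centroids.
points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
tri = Delaunay(points)

p = np.array([[0.2, 0.3]])           # a query point inside the hull
simplex = tri.find_simplex(p)        # index of the containing triangle
transform = tri.transform[simplex]   # per-simplex affine transform

# Same formula as in block_calculate_delauney_wlvl:
# first two barycentric coordinates from the transform, third from the
# constraint that barycentric coordinates sum to 1.
weight = np.empty((1, 3))
weight[:, :2] = np.sum(
    transform[:, :2] * (p - transform[:, 2])[:, np.newaxis], 2
)
weight[:, 2] = 1 - weight[:, 0] - weight[:, 1]

print(weight.sum())  # ~1.0: valid barycentric coordinates
```

Because the weights sum to 1, zeroing the weight of a no-data node (as the block does via `weight[nodelevel == NO_DATA_VALUE] = 0`) makes `weight_sum` drop below 1, which is what the `weight_sum > 0.5` suitability filter then detects.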

(screenshot)

And while you're at it, maybe check these deprecation warnings as well:

    tests/test_one_d_two_d.py::TestOneDTwoD::test_run_depth_at_timesteps
      d:\repositories\hhnk-threedi-tools\hhnk_threedi_tools\core\result_rasters\calculate_raster.py:169:
      DeprecationWarning: Please use `Delaunay` from the `scipy.spatial` namespace,
      the `scipy.spatial.qhull` namespace is deprecated.
        delaunay = qhull.Delaunay(points_grid)

    tests/test_one_d_two_d.py::TestOneDTwoD::test_run_depth_at_timesteps
      d:\repositories\hhnk-threedi-tools\hhnk_threedi_tools\core\result_rasters\calculate_raster.py:258:
      DeprecationWarning: Delaunay attribute 'vertices' is deprecated in favour of
      'simplices' and will be removed in Scipy 1.11.0.
        wlvl_block = self.block_calculate_delauney_wlvl(window)

    tests/test_one_d_two_d.py::TestOneDTwoD::test_run_depth_at_timesteps
      d:\repositories\hhnk-threedi-tools\hhnk_threedi_tools\core\result_rasters\calculate_raster.py:306:
      DeprecationWarning: Delaunay attribute 'vertices' is deprecated in favour of
      'simplices' and will be removed in Scipy 1.11.0.
        block_out = self.block_calculate_delauney_wlvl(window)
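Both warnings have straightforward fixes; a sketch with synthetic points standing in for the grid centroids:

```python
import numpy as np
# Import from scipy.spatial instead of the deprecated scipy.spatial.qhull:
from scipy.spatial import Delaunay

# Synthetic stand-in for the grid centroids used in calculate_raster.py.
points_grid = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
delaunay = Delaunay(points_grid)

# Use .simplices instead of the deprecated .vertices attribute:
print(delaunay.simplices.shape)  # (2, 3): two triangles, three vertex indices each
```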

wvangerwen commented 5 months ago

Solved the import and deprecation warnings for `simplices` and the `Delaunay` import.

Cannot replicate the error in https://github.com/threedi/hhnk-threedi-tools/pull/88#issuecomment-2081559204. Ignoring it for now; we need to revisit this part of the code before the next release.