benchmark-urbanism / cityseer-examples

Cityseer example notebooks

There are some issues with osm_to_cityseer. #1

Open GISer2000 opened 1 week ago

GISer2000 commented 1 week ago

Following the osm_to_cityseer.ipynb notebook to convert osmnx data to cityseer, the graphs.nx_decompose method raises an error:

INFO:cityseer.tools.graphs:Decomposing graph to maximum edge lengths of 25.
  0%|▍                                                                             | 20/4112 [00:00<00:01, 2194.25it/s]
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[20], line 2
      1 # decompose for higher-resolution analysis
----> 2 G_decomp = graphs.nx_decompose(multi_graph_cons, 25)
      3 # G_decomp = graphs.nx_decompose(G5, 25)

File D:\Anaconda3\envs\Accessibility\Lib\site-packages\cityseer\tools\graphs.py:1292, in nx_decompose(nx_multigraph, decompose_max)
   1288 new_nd_name, is_dupe = util.add_node(
   1289     g_multi_copy, [start_nd_key, sub_node_counter, end_nd_key], x=x, y=y  # type:ignore
   1290 )
   1291 if is_dupe:
-> 1292     raise ValueError(
   1293         f"Attempted to add a duplicate node at x: {x}, y:{y}. "
   1294         f"Check for existence of duplicate edges in the vicinity of {start_nd_key}-{end_nd_key}."
   1295     )
   1296 sub_node_counter += 1
   1297 # add and set live property if present in parent graph

ValueError: Attempted to add a duplicate node at x: 698029.6776641232, y:5711950.710355544. Check for existence of duplicate edges in the vicinity of 265-1668.

However, when I download the OSM network directly with cityseer, graphs.nx_decompose runs with no problem.
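For context, the error fires when two edges join the same pair of nodes, so decomposition tries to insert sub-nodes at identical coordinates. Stripped of the networkx machinery, checking for such parallel edges is just a multiset count over undirected node pairs. This is only a sketch over plain (u, v, geometry) tuples, not cityseer's actual MultiGraph:

```python
from collections import Counter

def find_parallel_edges(edges):
    """Return (u, v) node pairs that appear more than once, treating edges as undirected."""
    counts = Counter(tuple(sorted((u, v))) for u, v, *_ in edges)
    return [pair for pair, n in counts.items() if n > 1]

edges = [
    (265, 1668, "LINESTRING A"),
    (1668, 265, "LINESTRING A"),  # same endpoints, reversed: a parallel edge
    (265, 300, "LINESTRING B"),
]
print(find_parallel_edges(edges))  # [(265, 1668)]
```

On a real networkx MultiGraph the equivalent check would count keys per (u, v) pair, but the idea is the same.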

songololo commented 2 days ago

Thanks for reporting, perhaps we can add automatic duplicate node deduplication.

Do you possibly have a code sample that you can share so that I can reproduce the error?
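One possible shape for that deduplication, sketched in plain Python: snap coordinates to a tolerance and reuse one canonical node id per snapped position. The function name and the rounding tolerance are illustrative only, not cityseer API:

```python
def canonical_node_ids(coords, ndigits=3):
    """Assign one integer id per distinct coordinate pair, rounded to `ndigits`."""
    canon = {}  # rounded (x, y) -> canonical id
    ids = []
    for x, y in coords:
        key = (round(x, ndigits), round(y, ndigits))
        ids.append(canon.setdefault(key, len(canon)))
    return ids

# two near-duplicate points collapse onto one id
coords = [(1.0001, 2.0001), (1.0002, 2.0002), (5.0, 6.0)]
print(canonical_node_ids(coords))  # [0, 0, 1]
```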

GISer2000 commented 1 day ago

> Thanks for reporting, perhaps we can add automatic duplicate node deduplication.
>
> Do you possibly have a code sample that you can share so that I can reproduce the error?

import geopandas as gpd
import osmnx as ox
import pandas as pd
from cityseer.tools import graphs, io

road = gpd.read_file('data/road.shp')

# Extract start and end node coordinates
start_nodes = road.geometry.apply(lambda geom: list(geom.coords)[0])
end_nodes = road.geometry.apply(lambda geom: list(geom.coords)[-1])

# nodes
# Concatenate start and end coordinates, then drop duplicates.
df_nodes = pd.DataFrame(data={
    'x': pd.concat([start_nodes, end_nodes]).apply(lambda p: p[0]),
    'y': pd.concat([start_nodes, end_nodes]).apply(lambda p: p[1]),
})
df_nodes.drop_duplicates(inplace=True)
df_nodes.reset_index(drop=True, inplace=True)

gdf_nodes = gpd.GeoDataFrame(df_nodes, geometry=gpd.points_from_xy(df_nodes.x,df_nodes.y), crs=4326)
gdf_nodes['osmid'] = gdf_nodes.index
gdf_nodes['point'] = gdf_nodes.geometry.apply(lambda geom: list(geom.coords)[0])

# edges
gdf_edges = road[['geometry']].copy() 
# extract start and end coordinates to use as merge keys
gdf_edges['start'] = gdf_edges.geometry.apply(lambda geom: list(geom.coords)[0])
gdf_edges['end'] = gdf_edges.geometry.apply(lambda geom: list(geom.coords)[-1])

# get u, v and key
gdf_edges = pd.merge(
    gdf_edges,
    gdf_nodes[['osmid', 'point']].rename(columns={'point': 'start'}),
    how='left', on='start',
).rename(columns={'osmid': 'u'})
gdf_edges = pd.merge(
    gdf_edges,
    gdf_nodes[['osmid', 'point']].rename(columns={'point': 'end'}),
    how='left', on='end',
).rename(columns={'osmid': 'v'})
gdf_edges['key'] = 0

gdf_edges = gdf_edges.drop_duplicates(subset=['u','v','key','start','end'])  # drop duplicate edges

# unique nodes
unique_nodes = list(set(gdf_edges.u.to_list() + gdf_edges.v.to_list()))
gdf_nodes = gdf_nodes[gdf_nodes['osmid'].isin(unique_nodes)].copy()

# reset index
gdf_nodes.set_index(keys='osmid', inplace=True)
gdf_edges.set_index(keys=['u','v','key'], inplace=True)

# GeoPandas GeoDataFrames to NetworkX MultiDiGraph
nx_road = ox.graph_from_gdfs(gdf_nodes, gdf_edges).to_undirected()

G = io.nx_wgs_to_utm(nx_road)
G = graphs.nx_simple_geoms(G)
G = graphs.nx_remove_filler_nodes(G)
G = graphs.nx_remove_dangling_nodes(G)
G1 = graphs.nx_consolidate_nodes(G, buffer_dist=12, crawl=True)
G2 = graphs.nx_split_opposing_geoms(G1, buffer_dist=15)
G3 = graphs.nx_consolidate_nodes(G2, buffer_dist=15, neighbour_policy="indirect")
G4 = graphs.nx_remove_filler_nodes(G3)
G5 = graphs.nx_iron_edges(G4)

G_decomp = graphs.nx_decompose(G5, 25)  # error
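As a possible workaround until this is reproduced upstream, duplicate edges could be dropped from the edge list before building the graph. The sketch below works on plain (u, v, geometry) tuples rather than the actual GeoDataFrames, treats edges as undirected, and assumes the offending duplicates really do share identical geometry (which may not hold for all cases here):

```python
def dedupe_edges(edges):
    """Keep only the first edge seen for each undirected (u, v, geometry) triple."""
    seen = set()
    kept = []
    for u, v, geom in edges:
        key = (min(u, v), max(u, v), geom)  # order-independent endpoints
        if key not in seen:
            seen.add(key)
            kept.append((u, v, geom))
    return kept

# the reversed duplicate of 265-1668 is dropped; distinct edges survive
edges = [(265, 1668, "g"), (1668, 265, "g"), (265, 300, "h")]
print(dedupe_edges(edges))  # [(265, 1668, 'g'), (265, 300, 'h')]
```

The drop_duplicates call above keys on ['u','v','key','start','end'], so a reversed copy of the same edge (u and v swapped) survives it; an order-independent key like the one sketched here would catch that case too.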