pyt-team / challenge-icml-2024

Official repository for the Topological Deep Learning Challenge 2024, organized by TAG-DS & PyT-Team and hosted by GRaM Workshop @ ICML 2024.
https://pyt-team.github.io/packs/challenge.html
MIT License

Mapper Lifting (Graph to Hypergraph) #48

Open hfr1tz3 opened 3 months ago

hfr1tz3 commented 3 months ago

In this PR, we implement a lifting from graphs to hypergraphs via the "Mapper on Graphs" construction as depicted in Figure 30 [1], which can be enriched with the structure of a combinatorial complex. It operates in the following way:

1) Begin with a graph $X$ and a choice of filter function $g: X^{(0)}\to [a,b]$.
2) The algorithm creates $\mathcal{U}$, a cover of $[a,b]$, based on the parameters resolution (number of intervals) and gain (proportion of overlap between neighboring intervals).
3) For each $U\in \mathcal{U}$, the vertex set $g^{-1}(U)$ induces a subgraph $X_U$ of $X$; together, these pullback sets cover $X^{(0)}$.
4) For each $U$, "cluster" $X_U$ into its connected components. Each connected component is a hyperedge in the lifted topology.
5) Additionally, each edge in $X^{(1)}$ is a hyperedge in the lifted topology. This allows the hypergraph to be enriched with the structure of a combinatorial complex, if desired.
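The steps above can be sketched in pure Python. This is a minimal illustration, not the repository's implementation: the function name `mapper_hyperedges` is our own, the filter is passed as a dict from vertex to value, and the interval-widening convention (expanding each base interval by `gain * length` on both sides) is one common choice among several.

```python
def mapper_hyperedges(vertices, edges, g, resolution=3, gain=0.3):
    """Lift a graph to hyperedges via Mapper on graphs: cover the filter
    range, pull back each interval, and take connected components."""
    lo, hi = min(g.values()), max(g.values())
    length = (hi - lo) / resolution
    hyperedges = []
    for i in range(resolution):
        # Widen each base interval by gain * length on both sides
        # (one common convention for an overlapping cover).
        a = lo + i * length - gain * length
        b = lo + (i + 1) * length + gain * length
        pullback = {v for v in vertices if a <= g[v] <= b}
        # Connected components of the subgraph induced on the pullback.
        adj = {v: [] for v in pullback}
        for u, w in edges:
            if u in pullback and w in pullback:
                adj[u].append(w)
                adj[w].append(u)
        seen = set()
        for v in pullback:
            if v in seen:
                continue
            stack, comp = [v], set()
            while stack:
                x = stack.pop()
                if x in comp:
                    continue
                comp.add(x)
                stack.extend(adj[x])
            seen |= comp
            hyperedges.append(frozenset(comp))
    # Step 5: every original edge is also a hyperedge.
    hyperedges.extend(frozenset(e) for e in edges)
    return hyperedges
```

On a path graph with the vertex index as filter, each cover interval pulls back to a connected segment, so the Mapper hyperedges are just those segments plus the original edges.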

The 1-skeleton of the nerve of the resulting cover would give the results of the classic Mapper algorithm for graph simplification with filter function $g$.

By default, the function $g$ is the first projection of the graph Laplacian embedding of the unweighted adjacency matrix of $X$, since this is defined on all graphs and is known to capture topological information. However, we also provide alternative filters that the user can select (defined below); some require that the graph $X$ be enriched with additional features. Users may also define their own filter function $g$ as a lambda function, depending on what features their data contains.
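The default filter amounts to taking the Fiedler vector of the graph. A minimal NumPy analogue (not the `torch_geometric.transforms.AddLaplacianEigenvectorPE` transform itself, and the function name is our own): build the unnormalized Laplacian $L = D - A$ and take the eigenvector of the smallest nonzero eigenvalue.

```python
import numpy as np

def laplacian_filter(n, edges):
    """Fiedler-vector filter: eigenvector of the smallest nonzero
    eigenvalue of the unnormalized graph Laplacian L = D - A."""
    A = np.zeros((n, n))
    for u, v in edges:
        A[u, v] = A[v, u] = 1.0
    L = np.diag(A.sum(axis=1)) - A
    vals, vecs = np.linalg.eigh(L)  # eigenvalues in ascending order
    # Index 0 is the constant eigenvector (eigenvalue ~ 0 for a
    # connected graph); the next one is the Fiedler vector.
    return vecs[:, 1]
```

On a path graph this filter is monotone along the path (up to sign), which is why it orders vertices in a topologically meaningful way.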

On the provided ZINC dataset, we observe the following when applying the default parameters.

*(Figure: mapper_graph — Mapper lifting applied to a ZINC graph with default parameters.)*

Additionally, for the sake of time we restricted to the case of constructing a hypergraph using 1-dimensional filters. However, this implementation can be extended to higher-dimensional filters, lifting a graph to a larger hypergraph (or a simplicial complex, if we construct the nerve of the pullback).


The implemented example filter functions $g$ are the following:

  1. "laplacian" : Converts the data to an undirected graph, applies the `torch_geometric.transforms.AddLaplacianEigenvectorPE(k=1)` transform, and projects onto the eigenvector of the smallest nonzero eigenvalue.

  2. "svd" : Applies the `torch_geometric.transforms.SVDFeatureReduction(out_channels=1)` transform to the node feature matrix (i.e., `data.x`) to project the data onto a 1-dimensional subspace.

  3. "feature_pca" : Applies `torch.pca_lowrank(q=1)` to the node feature matrix (i.e., `data.x`) and projects onto the first principal component.

  4. "position_pca" : Applies `torch.pca_lowrank(q=1)` to the node position matrix (i.e., `data.pos`) and projects onto the first principal component.

  5. "feature_sum" : Applies `torch.sum(dim=1)` to the node feature matrix (i.e., `data.x`).

  6. "position_sum" : Applies `torch.sum(dim=1)` to the node position matrix (i.e., `data.pos`).

You may also construct your own `filter_attr` and `filter_func`:

  1. "my_filter_attr" : define `my_filter_func = lambda data: ...`, where `my_filter_func(data)` outputs an `(n_samples, 1)` tensor.

Additionally, when calling the transform, set `filter_attr = "my_filter_attr"` and `filter_func = my_filter_func`.
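The contract for a custom filter is simply a callable that maps a data object to an `(n_samples, 1)` array. A minimal sketch (the `Data` class below is a stand-in for `torch_geometric.data.Data`, plain lists replace tensors, and the mean-of-features filter is purely illustrative):

```python
class Data:
    """Stand-in for torch_geometric.data.Data: holds a node feature
    matrix x as a list of per-node feature rows."""
    def __init__(self, x):
        self.x = x

# Hypothetical custom filter: mean of each node's features,
# reshaped to the required (n_samples, 1) output.
def my_filter_func(data):
    return [[sum(row) / len(row)] for row in data.x]
```

With the real transform, this function would then be passed alongside `filter_attr = "my_filter_attr"` as described above.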


[1] Hajij, M., Zamzmi, G., Papamarkou, T., Miolane, N., Guzmán-Sáenz, A., Ramamurthy, K. N., et al. (2022). Topological deep learning: Going beyond graph data. arXiv preprint arXiv:2206.00606.

This is joint work between Halley Fritze and Marissa Masden.

review-notebook-app[bot] commented 3 months ago

Check out this pull request on  ReviewNB

See visual diffs & provide feedback on Jupyter Notebooks.



hfr1tz3 commented 3 months ago

I'm just making a comment to link all collaborators to the PR: Marissa Masden: @mmasden, Halley Fritze: @hfr1tz3


gbg141 commented 3 months ago

Hello @hfr1tz3! Thank you for your submission. As we near the end of the challenge, I am collecting participant info for the purpose of selecting and announcing winners. Please email me (or have one member of your team email me) at guillermo_bernardez@ucsb.edu so I can share access to the voting form. In your email, please include:

Before July 12, make sure that your submission respects all Submission Requirements laid out on the challenge page. Any submission that fails to meet these criteria will be automatically disqualified.