vincentherrmann / neural-layout

Use multi-level force layout method to visualize neural networks
https://vincentherrmann.github.io/demos/immersions
MIT License

layout_calculation in 3D? #2

Open ZhuoyueLyu opened 3 years ago

ZhuoyueLyu commented 3 years ago

Hi Vincent, this visualization looks really cool!

I tried to visualize my model in 3D using your algorithm. I changed num_dim in network_graph.py to 3 and printed out the positions in layout_calculation.py, but the vertices tend to collapse into 2D (the z-axis values become very similar for all vertices as the visualization unfolds).
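For what it's worth, one quick way to quantify this collapse is to look at the per-axis spread of the printed positions. This is just a diagnostic sketch; the `positions` array here is hypothetical stand-in data, not actual output from layout_calculation.py:

```python
import numpy as np

# Hypothetical (V, 3) array of vertex positions, standing in for the
# values printed from layout_calculation.py after the layout settles.
# The tiny z scale mimics a layout that has flattened into a plane.
positions = np.random.default_rng(0).normal(size=(100, 3)) * np.array([5.0, 5.0, 0.01])

# Standard deviation along each axis: a near-zero value on one axis
# indicates the layout is effectively two-dimensional.
spread = positions.std(axis=0)
print(spread)
```

If the third entry of `spread` is orders of magnitude smaller than the other two, the layout really is collapsing rather than just looking flat from one viewing angle.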

Do you have any suggestions on how to solve it? Thank you in advance!

vincentherrmann commented 3 years ago

Thanks!

To be honest, I did not really test the layout algorithm for more than two dimensions. But in theory it should work. How the dimensions are used depends on the connection patterns in your graph. Here is my intuition: A fully connected neural network has no incentive to use the third dimension. It is more efficient to just spread out in a plane. For a 1D convolutional network, the channels might spread a bit into the third dimension. A 2D convolutional network should construct something similar to the diagrams we often see, with the layers stacked on top of each other.
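This intuition can be tested with a toy experiment. The sketch below is NOT the repo's multi-level algorithm, just a minimal single-level force layout (spring attraction along edges, inverse-square repulsion between all pairs) run on a hypothetical fully connected bipartite layer; the singular values of the centered positions show how much each principal direction is actually used:

```python
import numpy as np

def toy_force_layout(edges, n, num_dim=3, steps=500, lr=0.02, seed=0):
    """Toy single-level force layout: spring attraction along edges,
    inverse-distance repulsion between all vertex pairs. A stand-in
    for probing how connection patterns use the available dimensions."""
    rng = np.random.default_rng(seed)
    pos = rng.normal(size=(n, num_dim))
    src = np.array([e[0] for e in edges])
    dst = np.array([e[1] for e in edges])
    for _ in range(steps):
        force = np.zeros_like(pos)
        # attraction: pull connected vertices together
        delta = pos[dst] - pos[src]
        np.add.at(force, src, delta)
        np.add.at(force, dst, -delta)
        # repulsion: push all pairs apart (O(n^2), fine for toy sizes)
        diff = pos[:, None, :] - pos[None, :, :]
        dist2 = (diff ** 2).sum(-1) + 1e-6
        force += (diff / dist2[..., None]).sum(axis=1)
        # clamp the step to keep the toy simulation stable
        pos += lr * np.clip(force, -10.0, 10.0)
    return pos

# hypothetical fully connected "layer": 8 inputs, 8 outputs
edges = [(i, 8 + j) for i in range(8) for j in range(8)]
pos = toy_force_layout(edges, n=16)

# singular values of the centered layout: a flat (planar) layout
# has one singular value much smaller than the others
sv = np.linalg.svd(pos - pos.mean(0), compute_uv=False)
print(sv)
```

Comparing the smallest singular value against the largest for different edge patterns (fully connected vs. 1D or 2D convolutional connectivity) would be one way to check whether the flattening is a property of the graph rather than a bug in the higher-dimensional code path.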

Maybe this helps, else I can try to see if there is something conceptually wrong with the code for higher dimensions.

ZhuoyueLyu commented 3 years ago

Thank you! Yeah, maybe that's the reason:

A fully connected neural network has no incentive to use the third dimension ... For a 1D convolutional network, the channels might spread a bit into the third dimension.

I tested it using a simple fully connected neural net and the resnet18_1d provided in your example_script.py. Okay, I will think about it, thanks!