epfl-ada / ada-2023-project-amonavis

ada-2023-project-amonavis created by GitHub Classroom

Building the GNN #17

Open WhimZig opened 6 months ago

WhimZig commented 6 months ago

I built part of the data preprocessing in the notebook gnn_machine.ipynb. I haven't finished running it, so please finish running it and add the pickling. That way we only have to run it once to get all the data, and can then pull out whatever info we want easily.
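Roughly what I mean by the pickling, as a minimal sketch (the file name and function names are placeholders, not what's in the notebook; it assumes the preprocessing produces some list of graph objects):

```python
import pickle
from pathlib import Path

CACHE_PATH = Path("preprocessed_graphs.pkl")  # hypothetical cache file name

def save_preprocessed(data_list, path=CACHE_PATH):
    """Dump the preprocessed graph objects so they only need to be built once."""
    with open(path, "wb") as f:
        pickle.dump(data_list, f)

def load_preprocessed(path=CACHE_PATH):
    """Reload the cached graph objects on later runs, skipping the preprocessing."""
    with open(path, "rb") as f:
        return pickle.load(f)
```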

The parts that are missing are the following:

For help with these two parts, the PyTorch Geometric introduction should be useful: https://pytorch-geometric.readthedocs.io/en/latest/get_started/introduction.html
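For reference, this is the kind of graph object that intro builds (a toy 3-node graph with 1-dimensional node features, taken in the spirit of the linked docs, not from our data):

```python
import torch
from torch_geometric.data import Data

# Undirected edges 0-1 and 1-2, stored as two directed pairs each.
edge_index = torch.tensor([[0, 1, 1, 2],
                           [1, 0, 2, 1]], dtype=torch.long)
x = torch.tensor([[-1.0], [0.0], [1.0]], dtype=torch.float)

data = Data(x=x, edge_index=edge_index)
print(data)  # Data(x=[3, 1], edge_index=[2, 4])
```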

An important note for the classifier: the backpropagation and the loss should just be the defaults for classification, nothing fancy there. The really important part is that during prediction we need an extra step to guarantee that the solution is valid. This is a pain.
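By "nothing fancy" I mean something like the sketch below: a plain two-layer GCN scoring each node, trained with cross-entropy. All the names here are placeholders, not what's in the notebook, and it assumes `data.y` holds a 0/1 label per node for "is in the path":

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class PathNodeClassifier(torch.nn.Module):
    """Two-layer GCN that scores each node as in-path / not-in-path."""
    def __init__(self, num_features, hidden_dim=16, num_classes=2):
        super().__init__()
        self.conv1 = GCNConv(num_features, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)  # raw logits per node

def train_step(model, data, optimizer):
    # Default classification setup: cross-entropy loss + standard backprop.
    model.train()
    optimizer.zero_grad()
    logits = model(data.x, data.edge_index)
    loss = F.cross_entropy(logits, data.y)  # data.y: 1 if the node is in the path
    loss.backward()
    optimizer.step()
    return loss.item()
```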

The crudest and simplest explanation I could come up with is to treat the classifier output as the probability that each node is in the path. Then a separate routine iterates through the nodes and only adds a node if it is valid, meaning it is a neighbor of at least one of the nodes explored so far. Every time a node gets added, go back to the start and check any previously skipped nodes again. A rough sketch of this is below.
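Here's how I picture that prediction-time step, as a sketch only (it assumes we already have per-node probabilities and an adjacency map, and that the path grows from a given source node; the names and the exact threshold are placeholders):

```python
def build_valid_path_set(probs, adjacency, source, threshold=0.5):
    """
    Greedily turn per-node 'in path' probabilities into a connected node set.

    probs:     dict node -> probability that the node is in the path
    adjacency: dict node -> set of neighboring nodes
    source:    node the path starts from (always included)
    """
    # Candidates the classifier believes are in the path, most confident first.
    candidates = sorted(
        (n for n, p in probs.items() if p >= threshold and n != source),
        key=lambda n: probs[n],
        reverse=True,
    )
    selected = {source}

    added = True
    while added:  # every time a node is added, re-check the remaining candidates
        added = False
        for node in candidates:
            if node in selected:
                continue
            # Valid = neighbor of at least one node explored so far.
            if adjacency[node] & selected:
                selected.add(node)
                added = True
                break  # go back to the start of the candidate list
    return selected
```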

The logic behind this is in one of the papers I've linked; I don't recall exactly which one right now.

I want to disappear during the ski weekend, but if I'm needed please message me during the night as that's the only time I can kinda help.

My code should work? I ran some tests and everything passed, but I can see some weird behavior happening when you try to add things to the classes.

codecamaru commented 6 months ago

OK, thanks, will try to do it ASAP.

On another note, Carlos and I pushed some versions of the algorithm. But I tried running your previous algos again and they take 0 seconds to run! How is this possible? Please, someone try it, haha, because it was supposed to be slow but it's incredibly fast!