cesarali / graph-bridges


Include Hollow Transformers #11

Open cesarali opened 1 year ago

cesarali commented 1 year ago

Carnegie Mellon Architecture

cesarali commented 1 year ago

Create Config here:

src/graph_bridges/models/backward_rates/backward_rate_config.py

Register Here:

https://github.com/cesarali/graph-bridges/blob/932560659845924138ea389030fc231e010a16a9/src/graph_bridges/models/backward_rates/backward_rate_utils.py#L7

Include In Test Here: tests/backward_rates/test_backward_rates.py
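The three steps above follow a config-plus-registry pattern. A minimal sketch of that pattern is below; the class and function names are illustrative assumptions, not the actual API in `backward_rate_utils.py`:

```python
# Hypothetical sketch of the define-config / register / create flow described
# above. Names are illustrative, not the repo's actual identifiers.
from dataclasses import dataclass

BACKWARD_RATE_REGISTRY = {}

def register_backward_rate(name):
    """Decorator mapping a config name string to its model class."""
    def wrap(cls):
        BACKWARD_RATE_REGISTRY[name] = cls
        return cls
    return wrap

@dataclass
class TemporalHollowTransformerConfig:
    # Defaults mirror the values discussed later in this issue.
    name: str = "TemporalHollowTransformer"
    num_heads: int = 2
    num_layers: int = 2
    hidden_dim: int = 128

@register_backward_rate("TemporalHollowTransformer")
class TemporalHollowTransformer:
    def __init__(self, config):
        self.config = config

def create_backward_rate(config):
    """Look up and instantiate the model class registered under config.name."""
    return BACKWARD_RATE_REGISTRY[config.name](config)
```

A test along the lines of `tests/backward_rates/test_backward_rates.py` would then just build the config, call the factory, and check the returned instance.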

cesarali commented 1 year ago

The temporal hollow transformer as a temporal network and config file was defined in:

https://github.com/cesarali/graph-bridges/blob/a8dfa0f2a98203497a3f73362e392dd678423400/src/graph_bridges/models/networks/transformers/temporal_hollow_transformers.py

The experiments for graphs are defined here:

https://github.com/cesarali/graph-bridges/blob/a8dfa0f2a98203497a3f73362e392dd678423400/scripts/experiments/cesar_prenzlauer_berg/experiment_ctdd_graphs.py

We should run experiments trying different values of:

num_heads=2, num_layers=2, hidden_dim=hidden_dim, ff_hidden_dim=hidden_dim*2, time_embed_dim=128, time_scale_factor=10
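One way to organize the sweep over those values is a small grid; this is just a sketch with assumed candidate values, keeping `ff_hidden_dim` tied to `hidden_dim` as in the settings above:

```python
# Sketch of a hyperparameter grid over the transformer settings listed above.
# The candidate value lists are assumptions to illustrate the sweep.
from itertools import product

hidden_dims = [64, 128, 256]
num_heads_options = [2, 4]
num_layers_options = [2, 4]

grid = []
for hidden_dim, num_heads, num_layers in product(
        hidden_dims, num_heads_options, num_layers_options):
    grid.append(dict(
        num_heads=num_heads,
        num_layers=num_layers,
        hidden_dim=hidden_dim,
        ff_hidden_dim=hidden_dim * 2,  # feed-forward width tied to hidden_dim
        time_embed_dim=128,
        time_scale_factor=10,
    ))
```

Each entry of `grid` could then be passed to the experiment script above as one run.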

as well as:

learning_rate=1e-3. Remember to include graphs in:

                                        metrics=["graphs_plots",
                                                 "histograms"]

so that we also obtain the orca values etc.
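Putting the learning rate and the metrics list together, the relevant experiment settings might look like the fragment below; the surrounding field names are an assumption, only `learning_rate` and the two metric strings come from this issue:

```python
# Illustrative experiment settings; only learning_rate and the metrics
# entries are from the issue, the dict layout itself is assumed.
experiment_settings = {
    "learning_rate": 1e-3,
    "metrics": ["graphs_plots",   # graph plots per evaluation step
                "histograms"],    # degree / statistic histograms (incl. orca)
}
```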

I don't know why the architecture is not really working (the simple MLP works like a charm). I hope it is just a matter of finding the right parameters, but maybe it is how the time embeddings are included in the architecture, or something more fundamental.
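To isolate the time-embedding hypothesis, it may help to check the embedding itself in isolation. A common choice (not necessarily what the repo uses) is a sinusoidal time embedding with a scale factor, sketched here in plain Python:

```python
# Sketch of a sinusoidal time embedding with a scale factor, a common way to
# feed scalar diffusion time into a transformer. This is a debugging aid, not
# the repo's actual temporal_hollow_transformers implementation.
import math

def sinusoidal_time_embedding(t, dim, scale=10.0):
    """Embed scalar time t (e.g. in [0, 1]) into a `dim`-sized vector."""
    half = dim // 2
    emb = []
    for i in range(half):
        # Geometrically spaced frequencies, as in standard positional encodings.
        freq = math.exp(-math.log(10000.0) * i / max(half - 1, 1))
        emb.append(math.sin(scale * t * freq))
        emb.append(math.cos(scale * t * freq))
    return emb[:dim]
```

If embeddings for nearby times are nearly identical (or the projection that mixes them into the token features is near zero at init), the network would effectively ignore time, which could explain the behavior above.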

We have to fix this!