cynricfu / MAGNN

Metapath Aggregated Graph Neural Network for Heterogeneous Graph Embedding

Training on Last.fm is too slow; how can I use a smaller Last.fm? #40

Closed Song-xx closed 1 year ago

Song-xx commented 1 year ago

Training on Last.fm is very slow because of its size: one epoch takes about 2 hours, which makes it almost impossible for me to run a grid search to find a better result. It would help a lot if I could use a smaller Last.fm, or just a part of it. Is that possible? I tried changing num_user = 1892, num_artist = 17632, num_tag = 11945 to smaller values in preprocess_LastFM.ipynb, but that caused errors.

Can someone tell me whether there is an easy way to use a smaller Last.fm? Thanks a lot.

cynricfu commented 1 year ago

Those numbers (num_user, num_artist, num_tag) are hard-coded and thus cannot be changed. An easy way is to just reduce the size of the training set. You may simply sample a part of the original training links (train_pos_user_artist and train_neg_user_artist) to use for grid search.
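
For reference, here is a minimal sketch of that subsampling, assuming the preprocessed Last.fm data stores the link splits as numpy arrays in .npz files (the file paths and key names below are illustrative and should be checked against what preprocess_LastFM.ipynb actually writes):

```python
import numpy as np

# Assumed paths/keys; adjust to match the actual output of preprocess_LastFM.ipynb.
pos = np.load('data/preprocessed/LastFM_processed/train_val_test_pos_user_artist.npz')
neg = np.load('data/preprocessed/LastFM_processed/train_val_test_neg_user_artist.npz')

train_pos = pos['train_pos_user_artist']
train_neg = neg['train_neg_user_artist']

# Keep a random 20% of the training links for faster grid-search runs.
rng = np.random.default_rng(0)
frac = 0.2
pos_idx = rng.choice(len(train_pos), size=int(frac * len(train_pos)), replace=False)
neg_idx = rng.choice(len(train_neg), size=int(frac * len(train_neg)), replace=False)

# Save reduced copies under new names, leaving the validation/test splits untouched.
np.savez('train_val_test_pos_user_artist_small.npz',
         train_pos_user_artist=train_pos[pos_idx],
         val_pos_user_artist=pos['val_pos_user_artist'],
         test_pos_user_artist=pos['test_pos_user_artist'])
np.savez('train_val_test_neg_user_artist_small.npz',
         train_neg_user_artist=train_neg[neg_idx],
         val_neg_user_artist=neg['val_neg_user_artist'],
         test_neg_user_artist=neg['test_neg_user_artist'])
```

Point the training script at the reduced files (or overwrite the originals in a separate copy of the data directory) and the per-epoch time should drop roughly in proportion to the sampling fraction.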

Song-xx commented 1 year ago

Thanks a lot. I also found that reducing hidden_dim and num_head speeds training up considerably, so I will try these methods. Thanks.