I tried to run CAWN on the Reddit dataset. The command I used is:
python main.py -d reddit --pos_dim 108 --bs 100 --n_degree 32 1 1 --mode t --bias 1e-8 --pos_enc lp --walk_pool sum --gpu 1
I found that the memory usage of the training process keeps increasing as training goes on. Why is that?
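In case it helps to narrow this down, below is a minimal sketch (my own, not from the CAWN repo) of the per-epoch memory logging I would drop into the training loop in main.py to check whether the growth is GPU memory held by PyTorch or host RAM; the calls to torch.cuda and psutil are standard, but where exactly to hook it into CAWN's loop is an assumption on my part.

```python
import os

import psutil
import torch


def log_memory(tag: str) -> None:
    """Print GPU and host memory usage at a given point in training."""
    # GPU memory currently occupied by tensors vs. reserved by the caching allocator.
    alloc_mib = torch.cuda.memory_allocated() / 2**20
    reserved_mib = torch.cuda.memory_reserved() / 2**20
    # Resident host (CPU) memory of this Python process.
    rss_mib = psutil.Process(os.getpid()).memory_info().rss / 2**20
    print(f"[{tag}] GPU allocated: {alloc_mib:.1f} MiB | "
          f"GPU reserved: {reserved_mib:.1f} MiB | host RSS: {rss_mib:.1f} MiB")


# Hypothetical usage inside the training loop (epoch numbers are placeholders):
# for epoch in range(num_epochs):
#     train_one_epoch(...)          # placeholder for CAWN's actual epoch step
#     log_memory(f"epoch {epoch}")
```

If GPU "allocated" grows steadily each epoch, that usually points to tensors being kept alive across iterations (e.g. accumulating losses without .item() or caching tensors in a list); if only host RSS grows, the culprit is more likely on the CPU side, such as a growing sampler or walk cache.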