Closed — alamsaqib closed this issue 5 years ago
May I ask which dataset you are running?
Thanks for the quick reply. I am using R8 and R52 under SGC\downstream\TextSGC.
This suggests that you don't have enough CPU memory. I can try to upload a preprocessed dataset so that it takes less memory.
In general, though, this step takes a couple of GBs of memory. How much CPU memory do you have?
Thanks a lot for replying. It would be great if you could upload a preprocessed dataset that consumes less memory. I have 8 GB of CPU memory.
I tested these scripts again, and it turns out the preprocessing uses much less than 8 GB of memory. I suspect that you might also be running other memory-intensive jobs on your machine.
That being said, I'll work on releasing a preprocessed dataset today.
Thank you so much for the quick replies; I really appreciate it.
See https://github.com/Tiiiger/SGC/tree/master/downstream/TextSGC. Thank you for voicing this concern. Closing for now, but please let me know if this doesn't work for you.
Thank you Tiiiger for making the changes, but it is still not working for me. I am posting the error message here.
While running train.py from downstream/TextSGC, a MemoryError occurs:

Traceback (most recent call last):
  File "C:\Users\Desktop\NewProjects\SGC\downstream\TextSGC\train.py", line 105, in <module>
  File "SGC\downstream\TextSGC\utils.py", line 177, in sparse_to_torch_dense
    dense = sparse.todense().astype(np.int32)
MemoryError
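For context on why this line fails on an 8 GB machine: `sparse.todense()` materializes the entire matrix, so memory use scales with rows × columns rather than with the number of nonzeros. The sketch below (an illustration, not the repo's actual code; the matrix size and density are made-up assumptions) estimates the allocation before it happens:

```python
import numpy as np
import scipy.sparse as sp

# Hypothetical example matrix: 20,000 x 20,000 with ~0.01% nonzeros,
# roughly the shape of a document-word graph in text classification.
mat = sp.random(20000, 20000, density=0.0001, format="csr", dtype=np.float32)

# Dense int32 storage needs rows * cols * 4 bytes, regardless of sparsity.
dense_bytes = mat.shape[0] * mat.shape[1] * np.dtype(np.int32).itemsize

# CSR storage only needs the nonzero values plus two index arrays.
sparse_bytes = mat.data.nbytes + mat.indices.nbytes + mat.indptr.nbytes

print(f"dense:  {dense_bytes / 1e9:.1f} GB")  # ~1.6 GB for this example
print(f"sparse: {sparse_bytes / 1e6:.1f} MB")
```

With a larger vocabulary or corpus the dense size grows quadratically, which is why the conversion can exhaust 8 GB of RAM even though the sparse data itself is small; keeping the matrix sparse (or densifying in chunks) avoids the spike.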