Open stephanienguyen2020 opened 1 year ago
Thanks for asking. I will get back to you on item 1 later. For now, I highly recommend running BEDwARS on a GPU, since it can take a very long time on CPU depending on the size of your data and the number of random-walk steps. I have not used Google Colab myself. Could you please elaborate on the issue you are facing? If you can store the files generated by BEDwARS on Google Colab, you can easily run inference.py on them.
Hi, I'm using a Mac M1 laptop, which doesn't have an NVIDIA GPU. Curriculum.py and Inference.py contain snippets that import cuda/cupy, which require a GPU my laptop doesn't support. I have come up with a few solutions:
1. Modified the code so it no longer requires CUDA/CuPy. For example, here is what I changed in Inference.py:

```python
dev = "cpu"  # Force CPU usage
device = torch.device(dev)
tch_dtype = torch.float32
use_cupy_sum = True

if use_cupy_sum:
    from torch.utils.dlpack import to_dlpack
    from torch.utils.dlpack import from_dlpack
else:
    torch_sum = torch.sum

np.set_printoptions(precision=2, suppress=True)
```
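As a side note, one way to make that selection portable is to pick the backend at runtime and enable the CuPy path only when CUDA is actually present, since CuPy requires an NVIDIA GPU. This is only a sketch; it assumes the rest of Inference.py reads `device`, `tch_dtype`, and `use_cupy_sum` in the same way as the snippet above:

```python
import numpy as np
import torch

# Pick the best available backend: CUDA GPU, Apple-Silicon MPS, else CPU.
if torch.cuda.is_available():
    dev = "cuda"
elif getattr(torch.backends, "mps", None) is not None and torch.backends.mps.is_available():
    dev = "mps"
else:
    dev = "cpu"

device = torch.device(dev)
tch_dtype = torch.float32

# CuPy only works on NVIDIA GPUs, so enable the CuPy-based sum on CUDA alone.
use_cupy_sum = dev == "cuda"

if use_cupy_sum:
    # DLPack lets CuPy and PyTorch share GPU buffers without copying.
    from torch.utils.dlpack import to_dlpack, from_dlpack
else:
    torch_sum = torch.sum

np.set_printoptions(precision=2, suppress=True)
```

With this check in place, the same file runs unmodified on a CUDA machine, an M1 Mac, or a plain CPU box, instead of hard-coding `"cpu"`.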
However, I still couldn't run the program; my log-infer file documented this:
Thank you for your help!