pygod-team / pygod

A Python Library for Graph Outlier Detection (Anomaly Detection)
https://pygod.org
BSD 2-Clause "Simplified" License

The model is not using GPU when I run benchmark/main.py. #91

Closed: functionendless closed this issue 10 months ago

functionendless commented 10 months ago

Hi, when I run benchmark/main.py with the --gpu parameter, GPU utilization stays near zero, and the CPU load is the same as with --gpu -1.

I then tried moving the data to the GPU with data.to(device=validate_device(gpu_id=args.gpu)), but device-mismatch errors occurred. Digging further, I found that because of the nested model structure and temporary tensor definitions, most of the parameters stay on the CPU even though the outer model's device is the GPU.

For example, in ANOMALOUS's fit method (pygod/detector/anomalous.py), self.model = ANOMALOUSBase(w_init, r_init) leaves self.model on the CPU, so the loss and its backward pass are computed entirely on the CPU.
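A minimal sketch of the pattern described above, assuming the inner module is a torch.nn.Module; InnerModel and its parameters are illustrative stand-ins, not PyGOD's actual code:

```python
import torch
import torch.nn as nn

class InnerModel(nn.Module):
    # stand-in for ANOMALOUSBase: wraps the init tensors as parameters
    def __init__(self, w_init, r_init):
        super().__init__()
        self.w = nn.Parameter(w_init)
        self.r = nn.Parameter(r_init)

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# constructing the module from CPU tensors leaves its parameters on CPU...
model = InnerModel(torch.randn(4, 4), torch.randn(4, 4))
# ...unless the module itself is explicitly moved to the target device
model = model.to(device)
print(next(model.parameters()).device)
```

Without the final .to(device), every gradient step runs on the CPU even when the outer detector reports a GPU device, which matches the behavior observed here.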

Moreover, the two different implementations of the same initialization, at pygod/detector/radar.py lines 119-123 and pygod/detector/anomalous.py lines 123-127, are also confusing:

pygod/detector/radar.py:

    w_init = torch.eye(x.shape[0]).to(self.device)
    r_init = torch.inverse((1 + self.weight_decay) *
                           torch.eye(x.shape[0]).to(self.device) +
                           self.gamma * laplacian) @ x

    return x, s, laplacian, w_init, r_init

pygod/detector/anomalous.py:

    w_init = torch.randn_like(x.T)
    r_init = torch.inverse((1 + self.weight_decay) *
                           torch.eye(x.shape[0]) +
                           self.gamma * laplacian) @ x

    return x, s, laplacian, w_init, r_init
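One way to make this initialization device-consistent is to build every tensor on the target device up front. The sketch below is illustrative only (it follows radar.py's identity init for w_init and is not the project's actual patch):

```python
import torch

def init_params(x, laplacian, weight_decay, gamma, device):
    """Device-consistent init mirroring the snippets above (illustrative)."""
    x = x.to(device)
    laplacian = laplacian.to(device)
    eye = torch.eye(x.shape[0], device=device)
    w_init = eye.clone()  # identity init, as in radar.py
    r_init = torch.inverse((1 + weight_decay) * eye +
                           gamma * laplacian) @ x
    return w_init, r_init

# demo on CPU with a placeholder (zero) Laplacian
x = torch.randn(5, 3)
lap = torch.zeros(5, 5)
w, r = init_params(x, lap, weight_decay=0.1, gamma=1.0,
                   device=torch.device('cpu'))
print(w.device, r.device)
```

Because eye, laplacian, and x all live on the same device, the inverse and the matmul never mix devices, avoiding the mismatch errors described above.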

I'm wondering how to make full use of the GPU with PyGOD, and I look forward to your reply.
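As a quick way to diagnose this kind of mismatch (a generic PyTorch sketch, not PyGOD-specific), one can inspect where a model's parameters actually live:

```python
import torch
import torch.nn as nn

def report_devices(model):
    # collect the set of device types holding the model's parameters;
    # a correctly-placed model should report exactly one device
    return {p.device.type for p in model.parameters()}

model = nn.Linear(8, 2)        # parameters start on CPU
print(report_devices(model))   # {'cpu'}
if torch.cuda.is_available():
    model = model.cuda()
    print(report_devices(model))
```

If the set contains more than one device type, or 'cpu' when a GPU run was requested, some submodule or tensor was never moved.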

kayzliu commented 10 months ago

Thank you for pointing it out! We'll fix it soon.

kayzliu commented 10 months ago

We have fixed the bug. You can now build from source to circumvent this issue.