Right now CUDA is required in a number of places. That's likely fine in practice for training, but it also means much of the code can't be used out of the box on any machine without CUDA access (such as MacBook laptops). A CPU option would be really helpful for testing and also for inference, since a pretrained model can run on the CPU even if it's pretty slow.
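One common pattern for this is a small device-selection helper that falls back to CPU when CUDA is unavailable. This is just a sketch of the idea, not the repo's actual code; the function and model names here are illustrative.

```python
# Illustrative sketch: pick CUDA when available, otherwise fall back to CPU.
import torch

def get_device(prefer_cuda: bool = True) -> torch.device:
    """Return a CUDA device if requested and available, else CPU."""
    if prefer_cuda and torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

device = get_device()
model = torch.nn.Linear(4, 2).to(device)  # instead of a hard-coded model.cuda()
x = torch.randn(1, 4, device=device)      # create tensors on the same device
y = model(x)
```

With this pattern, the same script runs unchanged on a CUDA box and on a CPU-only laptop; a `--device` CLI flag could expose the same choice to users.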