benjamin-kroeger opened this issue 1 year ago

The `load_model` method currently doesn't pass the device to the DeepBLAST model, forcing it to always run in `cuda`/GPU mode. With the current implementation it is not possible to run on the CPU through the top-level functions, which is a shame since the code for running on the CPU already exists. Could you maybe fix that?

Hi - not sure why this is the case. We can look into this, although it may take a while to allocate time. But if you find a solution, feel free to submit a PR.
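For anyone who needs CPU inference before a fix lands, the usual PyTorch pattern is to pass the target device through `torch.load` via `map_location` and then move the model with `.to(device)`. Below is a minimal, self-contained sketch of that pattern; the `TinyAligner` class, the checkpoint layout, and this `load_model` signature are illustrative assumptions, not the actual DeepBLAST code.

```python
# Sketch only: shows how a loader can honour the caller's device choice
# instead of hard-coding CUDA. TinyAligner is a stand-in model class, not
# part of DeepBLAST.
import torch
import torch.nn as nn


class TinyAligner(nn.Module):
    """Stand-in for the real alignment model, used only for this sketch."""

    def __init__(self, embed_dim: int = 32):
        super().__init__()
        self.proj = nn.Linear(embed_dim, embed_dim)

    def forward(self, x):
        return self.proj(x)


def load_model(ckpt_path: str, device: str = "cpu") -> nn.Module:
    """Load a checkpoint onto the requested device ("cpu" or "cuda")."""
    dev = torch.device(device)
    # map_location is the key step: it lets a CUDA-trained checkpoint be
    # deserialised on a CPU-only machine instead of raising an error.
    state = torch.load(ckpt_path, map_location=dev)
    model = TinyAligner()
    model.load_state_dict(state)
    return model.to(dev).eval()


if __name__ == "__main__":
    # Round-trip demo: save a state dict, then reload it onto the CPU.
    torch.save(TinyAligner().state_dict(), "tiny_aligner.pt")
    model = load_model("tiny_aligner.pt", device="cpu")
    print(next(model.parameters()).device)  # -> cpu
```

The same idea would presumably apply to the real `load_model`: accepting a device argument and forwarding it to both the checkpoint load and the constructed model would let the existing CPU code path be reached from the top-level functions.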