xegulon opened this issue 3 years ago
Hi, thanks for the amazing work.
Is there a snippet somewhere explaining how to efficiently use the model for inference?
Like this:
Thanks a lot!

Unfortunately this is currently not supported. As a workaround, you could use the prediction mode, which saves the predictions to disk. Maybe I will add this someday, so the model can be directly used for inference in Python, but I cannot make any promises.
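In case it helps anyone landing here, below is a minimal sketch of that workaround: run the prediction mode from the command line, then load the JSON file it writes to disk. The config name (`configs/example_predict.conf`), the `predictions_path` option, and the output field names (`tokens`, `entities`, `relations`, `start`, `end`, `head`, `tail`) are assumptions based on the example configs in the repo, so check them against your checkout.

```python
import json

# 1) Run SpERT's prediction mode on the command line; it writes the predictions to disk
#    (config name and output path are assumptions, see the repo's example configs):
#    python ./spert.py predict --config configs/example_predict.conf
#
# 2) Load the JSON file that the config's "predictions_path" points to.
with open("data/predictions.json", encoding="utf-8") as f:
    predictions = json.load(f)

# Assumed output layout: one entry per document, holding the tokenized text plus
# predicted entities (token spans) and relations (indices into the entity list).
for doc in predictions:
    tokens = doc["tokens"]
    for entity in doc.get("entities", []):
        mention = " ".join(tokens[entity["start"]:entity["end"]])
        print(f'entity: {entity["type"]} -> {mention}')
    for relation in doc.get("relations", []):
        head = doc["entities"][relation["head"]]
        tail = doc["entities"][relation["tail"]]
        print(f'relation: {relation["type"]} ({head["type"]} -> {tail["type"]})')
```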
Can SpERT run on the CPU in an Ubuntu virtual machine?

@ader456 Yes, I think so.
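For reference, SpERT is implemented in PyTorch, so a CPU-only setup such as a VM should work, just much more slowly than on a GPU. A quick, SpERT-agnostic way to confirm which device PyTorch will pick inside the VM (the repo's configs also appear to accept a flag for forcing CPU execution, but treat that as an assumption and check the argument parser):

```python
import torch

# Inside a CPU-only VM, CUDA is simply unavailable and PyTorch falls back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"CUDA available: {torch.cuda.is_available()} -> using {device}")

# Models and tensors can then be placed on that device explicitly, e.g.:
x = torch.randn(2, 3).to(device)
print(x.device)
```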