[Closed] wingillis closed this 1 year ago
Added the original robust ARHMM formulation from MoSeq to jax-moseq. Utilizing the GPU and JIT compilation speeds up model training by 5-10x.
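For anyone unfamiliar with the robust formulation, here is a minimal sketch of the core idea, assuming the robust ARHMM replaces the Gaussian AR noise model with a heavier-tailed Student's t likelihood; all function and parameter names below are illustrative, not the actual jax-moseq API:

```python
import jax
import jax.numpy as jnp
from jax.scipy.special import gammaln

# Hypothetical sketch -- not the jax-moseq implementation. The "robust"
# ARHMM uses Student's t noise, which down-weights outlier frames
# relative to a Gaussian.

def student_t_log_prob(x, mu, sigma, nu):
    """Log density of a univariate Student's t with location mu,
    scale sigma, and degrees of freedom nu."""
    z = (x - mu) / sigma
    return (gammaln((nu + 1) / 2) - gammaln(nu / 2)
            - 0.5 * jnp.log(nu * jnp.pi) - jnp.log(sigma)
            - (nu + 1) / 2 * jnp.log1p(z ** 2 / nu))

@jax.jit  # jit compiles the whole likelihood into fused GPU ops
def ar_log_likelihoods(data, Ab, sigmas, nu):
    """Per-state AR(1) log likelihoods under t-distributed noise.

    data:   (T, D) observation sequence
    Ab:     (K, D, D + 1) per-state AR weights plus a bias column
    sigmas: (K, D) per-state noise scales
    nu:     scalar degrees of freedom
    Returns a (T - 1, K) array of frame-by-state log likelihoods.
    """
    # Append a constant 1 so the bias column of Ab acts as an offset.
    x_prev = jnp.concatenate(
        [data[:-1], jnp.ones((data.shape[0] - 1, 1))], axis=1)
    preds = jnp.einsum('kde,te->tkd', Ab, x_prev)        # (T-1, K, D)
    lls = student_t_log_prob(
        data[1:, None, :], preds, sigmas[None], nu)      # broadcast over K
    return lls.sum(-1)
```

Because the likelihood is pure array arithmetic, `jax.jit` can compile it end to end for the GPU, which is the kind of fusion the 5-10x speedup relies on.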
Woot! We really should have written this model up somewhere. Did it appear in the methods of the 2018 or 2023 papers?
I don't think it appeared in the methods - we totally should write it up!