there are at least a few differences that i notice:
prior versions of hypertools normalized data by default. this no longer happens. in the above example, i've specified that the data should be z-scored (within feature) prior to passing to UMAP, but i need to double check that the "preprocessing" is analogous to the prior version. e.g. should 'ZScore' be replaced with 'Normalize' and/or some other preprocessing step?
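for reference, here's a minimal numpy sketch of the distinction in question (illustrative only — the variable names are mine, not hypertools's preprocessing API): z-scoring gives each feature mean 0 and unit standard deviation, while min-max normalization rescales each feature to [0, 1].

```python
import numpy as np

# toy dataset: 4 observations x 2 features (illustrative, not hypertools API)
X = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0],
              [4.0, 40.0]])

# z-scoring within feature: each column ends up with mean 0, std 1
zscored = (X - X.mean(axis=0)) / X.std(axis=0)

# min-max normalization within feature: each column rescaled to [0, 1]
normalized = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
```

if the old default preprocessing was min-max-style normalization rather than z-scoring, swapping 'ZScore' for the analogous step could matter, since the two transforms feed UMAP differently scaled inputs.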
the first demo uses a Gaussian kernel (variance = 50) to smooth the data, whereas the second example uses a boxcar kernel (width=25). i can't imagine that this would substantially change the results...but you never know...
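to sanity-check that intuition, here's a quick numpy comparison of the two kernels on a synthetic 1D signal (a random walk standing in for one data column — not the actual demo data): the two smoothed traces should be highly correlated but not identical.

```python
import numpy as np

# hypothetical 1D timeseries to smooth (stand-in for one data column)
rng = np.random.default_rng(0)
signal = np.cumsum(rng.standard_normal(500))

# Gaussian kernel with variance 50 (std = sqrt(50)), as in the first demo
t = np.arange(-50, 51)
gauss = np.exp(-t**2 / (2 * 50.0))
gauss /= gauss.sum()

# boxcar kernel of width 25, as in the second demo
box = np.ones(25) / 25

smoothed_gauss = np.convolve(signal, gauss, mode='same')
smoothed_box = np.convolve(signal, box, mode='same')

# the two smoothed versions track each other closely but differ in detail
r = np.corrcoef(smoothed_gauss, smoothed_box)[0, 1]
```

so the kernel swap probably isn't the main culprit, though it would subtly change high-frequency detail.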
i don't think that the normalization step gets applied in the old hyp.align function, but it's possible the previous demo normalized the data twice (once prior to the first alignment step and again prior to the second alignment/projecting into 3D steps). if so, the second demo could be updated to normalize multiple times.
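one thing worth noting (a quick numpy check, not a claim about the hypertools internals): z-scoring is idempotent, so normalizing twice back-to-back is a no-op — double normalization only matters if a transform like alignment happens in between the two passes.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 5)) * 3 + 7  # arbitrary scale and offset

def zscore(a):
    # z-score within feature (columns)
    return (a - a.mean(axis=0)) / a.std(axis=0)

once = zscore(X)
twice = zscore(once)
# once and twice are identical: a second pass on already z-scored
# data changes nothing unless the data were transformed in between
```

so if the old demo did normalize twice, the second normalization only mattered because the alignment step sat between the two passes.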
it's also possible that something is off with the revised hyperalignment implementation, even though the alignment tests are passing...
using previous versions of hypertools, the demo "gif" can be reproduced as follows:
a similar approach should work with the revamped version, but in practice it doesn't seem to work well: