Closed: marcrocasalonso closed this issue 2 years ago
Dear bghojogh,
I have compared the results of your Isomap implementation with sklearn's Isomap, and the embedding spaces they produce are different. I ran sklearn's Isomap with the same k-NN parameters and variables as your implementation. Do you know what could be the cause?
Thank you in advance, Best regards, Marc
Hi dear Marc. Sorry for the delay in my response; I have been busy these days. I checked sklearn's Isomap implementation and found the reason for the difference.
If you compare the two implementations:

Both construct the geodesic distance matrix, denoted by D. sklearn's implementation uses D^2 whereas my implementation uses D; although this does not make much difference. The main difference is in the eigenvalue decomposition of the kernel (K = -0.5 H D H, where H is the centering matrix). In my implementation, I use np.linalg.eigh() for the eigenvalue decomposition. However, sklearn passes the kernel to its KernelPCA class: https://github.com/scikit-learn/scikit-learn/blob/7389dbac82d362f296dc2746f10e43ffa1615660/sklearn/decomposition/kernel_pca.py#L18 That class offers several options for the eigenvalue decomposition: https://github.com/scikit-learn/scikit-learn/blob/7389dbac82d362f296dc2746f10e43ffa1615660/sklearn/decomposition/kernel_pca.py#L192 Hence, different implementations of the eigenvalue decomposition can yield slightly different results.
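To illustrate the point, here is a minimal sketch that builds the geodesic kernel by hand, eigendecomposes it with np.linalg.eigh, and compares the result against sklearn's Isomap (which delegates to KernelPCA). The data set (make_swiss_roll) and the parameter values are illustrative assumptions, not taken from either repository; the embeddings are only expected to agree up to column-wise sign flips and small numerical differences.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap
from sklearn.neighbors import kneighbors_graph

X, _ = make_swiss_roll(n_samples=300, random_state=0)
n_neighbors, n_components = 10, 2  # assumed, illustrative parameters

# Geodesic distance matrix D from the k-nearest-neighbor graph.
graph = kneighbors_graph(X, n_neighbors=n_neighbors, mode='distance')
D = shortest_path(graph, method='D', directed=False)

# Kernel via double centering; sklearn centers D^2 (classical MDS),
# centering D instead changes the embedding only slightly.
n = D.shape[0]
H = np.eye(n) - np.ones((n, n)) / n
K = -0.5 * H @ (D ** 2) @ H

# Route 1: plain symmetric eigendecomposition with np.linalg.eigh.
eigvals, eigvecs = np.linalg.eigh(K)
order = np.argsort(eigvals)[::-1][:n_components]
embedding_eigh = eigvecs[:, order] * np.sqrt(eigvals[order])

# Route 2: sklearn's Isomap, which uses KernelPCA internally.
embedding_sklearn = Isomap(n_neighbors=n_neighbors,
                           n_components=n_components).fit_transform(X)

# Eigenvectors are defined only up to sign, so compare via |correlation|.
for j in range(n_components):
    corr = np.corrcoef(embedding_eigh[:, j], embedding_sklearn[:, j])[0, 1]
    print(f"component {j}: |correlation| = {abs(corr):.4f}")
```

Under these assumptions the per-component correlations should be close to 1 in absolute value, while the raw coordinates may differ in sign and by small numerical amounts depending on the eigen solver used.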
Thank you very much for the explanation and for taking the time to investigate what sklearn is doing compared with your code.
Best regards, Marc