karlnapf opened 9 years ago
List of dimensionality reduction algorithms:
- PCA
- Stochastic Proximity Embedding (SPE)
- Diffusion Maps
- Kernel Locally Linear Embedding
- Kernel Local Tangent Space Alignment
- Linear Local Tangent Space Alignment
- Neighborhood Preserving Embedding
- Locality Preserving Projections
- Locally Linear Embedding (LLE)
- Hessian Locally Linear Embedding (HLLE)
- Local Tangent Space Alignment (LTSA)
- Kernel PCA (kPCA)
- Multidimensional Scaling (MDS, with possible landmark approximation)
- Isomap (using Fibonacci heap Dijkstra for shortest paths)
- Laplacian Eigenmaps
Do you think there could be more? I searched the repo for dimensionality reduction and found these algorithms.
By "reproduce common examples", do you mean recoding the various Python examples from scikit-learn in C++ and putting them into Shogun, or something else?
Hi, thanks for the list! I mean reproducing standard examples in a notebook using Shogun's Python interface -- no C++ involved.
@karlnapf: I recreated one of the listed tutorials here using Shogun (in Python). I used the USPS dataset and created an analog of the scikit-learn tutorial, specifically the Hessian LLE one given here.
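For reference, the scikit-learn side of such a comparison is short. A minimal sketch, using a swiss roll as a stand-in for the USPS data (the neighbor count and solver choice here are illustrative, not taken from the tutorial):

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# Toy manifold data standing in for USPS digits.
X, _ = make_swiss_roll(n_samples=300, random_state=0)

# Hessian LLE requires n_neighbors > n_components * (n_components + 3) / 2;
# for n_components=2 that means more than 5 neighbors.
hlle = LocallyLinearEmbedding(
    n_neighbors=10,
    n_components=2,
    method="hessian",
    eigen_solver="dense",  # dense solver is robust for small sample sizes
)
Y = hlle.fit_transform(X)
print(Y.shape)  # (300, 2)

# This is the attribute the thread is discussing -- scikit-learn exposes it,
# Shogun's converter apparently does not.
print(hlle.reconstruction_error_)
```

Comparing the Shogun embedding against `Y` (up to rotation/scaling, since embeddings are only defined up to such transforms) is one way to sanity-check the notebook.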
The thing is, how do I know whether this is correct/accurate? I can't find a Shogun analog of `reconstruction_error` in the Hessian LLE module (see the scikit-learn page). Let me know if you'd like to take a look at my notebook. I will create more for the algorithms that don't have any examples yet.
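In case it helps with a Shogun-side implementation, the quantity LLE minimizes can be written down directly. A minimal numpy sketch of that reconstruction cost (note: scikit-learn computes its reported value internally from the embedding eigenproblem, so its number need not match this cost exactly; this is just the underlying objective):

```python
import numpy as np

def lle_reconstruction_error(X, W):
    """LLE-style reconstruction cost: sum_i ||x_i - sum_j W[i, j] * x_j||^2.

    X : (n_samples, n_features) data matrix
    W : (n_samples, n_samples) neighbor-weight matrix, rows summing to 1,
        with W[i, j] = 0 unless j is a neighbor of i
    """
    residual = X - W @ X
    return float(np.sum(residual ** 2))

# Tiny check: three collinear points, each reconstructed exactly from the
# other two, so the cost should be zero.
X = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
W = np.array([[0.0, 2.0, -1.0],   # x0 = 2*x1 - 1*x2
              [0.5, 0.0, 0.5],    # x1 = midpoint of x0 and x2
              [-1.0, 2.0, 0.0]])  # x2 = 2*x1 - 1*x0
print(lle_reconstruction_error(X, W))  # 0.0
```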
I had to slice the USPS dataset and make it smaller to run (hope that is okay). The scikit-learn example, by the way, uses a smaller dataset as well.
The reconstruction looks like this; I changed the dimensions from (256, 9000) to (256, 2048) to make it feasible to run.
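The slicing itself is a one-liner in numpy; a sketch with a zero matrix standing in for the real USPS data (features along rows, samples along columns, as in the shapes above):

```python
import numpy as np

# Hypothetical stand-in for the USPS matrix: 256 features x 9000 samples.
usps = np.zeros((256, 9000))

# Keep only the first 2048 samples to make the embedding feasible to run.
usps_small = usps[:, :2048]
print(usps_small.shape)  # (256, 2048)
```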
The notebook is here.
Nice in principle. Can you send a PR so I can give comments? Please always paste a link to the notebook with outputs in the PR; the PR itself should be without outputs. Looking forward to reviewing it.
@lisitsyn, any ideas on the reconstruction error? If we don't have it, it would be good to add to Shogun; it should be easy.
Also related to #2673.