gionuno / autoencoder_trees

Autoencoder Trees, based on Ozan Irsoy's paper.

Unsupervised version #1

Open Jerry001 opened 7 years ago

Jerry001 commented 7 years ago

This seems to be a supervised algorithm. Can it be trained in an unsupervised way?

gionuno commented 7 years ago

Wow, I didn't think anyone would see it to be honest... I'll think about it a while, and I'll get back to you if I come up with a solution.

Jerry001 commented 7 years ago

Thank you for the quick response. It would be quite interesting to see its performance on the MNIST benchmark dataset.

I guess you are a very experienced Matlab user (with the semicolons at the end of each line :) ). Maybe it's better to remove them in Python code.

gionuno commented 7 years ago

Well, I use C++ and Matlab more... I guess it's more of a habit. It started as a joke among friends, because I programmed so much in those two languages.

Anyways, I think if you want it unsupervised you'll need to run a soft k-means sort of thing over batches, then run a couple of gradient descent steps. But I don't know for sure if that'll work... this code, to be honest, is a rough outline of the paper. I'm sure if you read Irsoy's paper you'll get a better idea of how it can be modified for unsupervised learning.
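The soft k-means step mentioned above could look something like the following minimal NumPy sketch (an assumption about one way to realize the idea, not code from this repo): compute softmax responsibilities over negative squared distances for a batch, then take a responsibility-weighted centroid update. The responsibilities could then serve as soft pseudo-targets for a few gradient steps on the tree parameters.

```python
import numpy as np

def soft_kmeans_targets(X, C, beta=1.0):
    """Soft k-means assignments for a batch X (n x d) against centroids
    C (k x d). Returns an (n x k) responsibility matrix whose rows sum
    to 1 (a softmax over negative squared distances, sharpness beta)."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)  # (n, k)
    logits = -beta * d2
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    R = np.exp(logits)
    return R / R.sum(axis=1, keepdims=True)

def update_centroids(X, R):
    """Responsibility-weighted centroid update (one soft k-means step)."""
    return (R.T @ X) / R.sum(axis=0)[:, None]

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 5))                   # toy batch
C = X[rng.choice(64, size=3, replace=False)]   # init centroids from data
for _ in range(10):
    R = soft_kmeans_targets(X, C)
    C = update_centroids(X, R)
```

Between centroid updates, one would alternate a couple of gradient descent steps on the tree, using `R` as the soft targets for the leaves.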

Jerry001 commented 7 years ago

Got it. It's difficult to change the habit :)

That's great. From your code base I got to know Irsoy's paper. Thanks. As with conventional autoencoders, I think we can get the unsupervised version by using the reconstruction error as the loss.

gionuno commented 7 years ago

Sorry for the wait. You're right, I checked Irsoy's paper, you can do that. Make two of these: one that scales down dimension and another that scales back to the original dimension. The L_2 error is the way to go on this...
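To make the two-module idea concrete, here is a minimal sketch of the L_2 reconstruction objective, with plain linear maps standing in for the two trees (a hypothetical simplification; in the actual model each map would be a soft decision tree — one scaling down the dimension, one scaling back up):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))  # toy data in 8 dimensions
h = 3                          # bottleneck (code) dimension

# Stand-ins for the two trees: encoder scales down to h dims,
# decoder scales back to the original dimension.
W_enc = rng.normal(scale=0.1, size=(8, h))
W_dec = rng.normal(scale=0.1, size=(h, 8))

def l2_loss(X, W_enc, W_dec):
    """Mean squared reconstruction error ||dec(enc(X)) - X||^2."""
    return ((X @ W_enc @ W_dec - X) ** 2).mean()

lr = 0.05
loss_before = l2_loss(X, W_enc, W_dec)
for _ in range(200):
    Z = X @ W_enc                 # code (scaled-down representation)
    Xh = Z @ W_dec                # reconstruction
    G = 2.0 * (Xh - X) / X.size   # dL/dXh for the mean squared error
    G_dec = Z.T @ G               # dL/dW_dec
    G_enc = X.T @ (G @ W_dec.T)   # dL/dW_enc
    W_dec -= lr * G_dec
    W_enc -= lr * G_enc
loss_after = l2_loss(X, W_enc, W_dec)
```

Swapping the linear maps for two soft decision trees gives the unsupervised autoencoder-tree setup discussed here; the training loop and loss stay the same.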

gionuno commented 7 years ago

It'd be better to implement it in TensorFlow, but I didn't jump on the bandwagon... I implemented these things to show myself (at least) that I understand the ideas behind machine learning. <- Does this sound snobbish?