alexbw / adabayes


auto-tuner #1

Open alexbw opened 9 years ago

alexbw commented 9 years ago

How am I going to get Whetlab to tune a net in a single line of code?

JasperSnoek commented 9 years ago

Why are you limited to one line of code?

alexbw commented 9 years ago

Zero lines?

JasperSnoek commented 9 years ago

That would be impressive...

alexbw commented 9 years ago

I'm thinking about how best to control the parameters of the net. Looking through nn, it seems that the __init methods instantiate tensors. So, if we don't pass in Whetlab'd parameters at first, Whetlab would have to re-init all of the layers. We could come up with some metatable trick that mimics all modules, allows for a parameter-less initialization, and then passes the parameters on once Whetlab comes up with a set.
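
For illustration, here's a minimal sketch of the deferred-construction idea, assuming the net is built with Torch's nn. It uses plain closures rather than the metatable trick described above, and the `spec` table, `build` function, and `proposal` value are hypothetical names for illustration, not Whetlab's API.

```lua
-- Sketch only: describe each layer as a constructor closure so that no
-- tensors are allocated until a concrete hyperparameter set is available.
local nn = require 'nn'

-- Hypothetical spec: layers expressed as functions of the hyperparameters 'p'.
local spec = {
  function(p) return nn.Linear(784, p.hidden) end,
  function(p) return nn.Tanh() end,
  function(p) return nn.Linear(p.hidden, 10) end,
}

-- Build the real modules once the tuner has proposed parameters.
local function build(layerSpec, params)
  local net = nn.Sequential()
  for _, make in ipairs(layerSpec) do
    net:add(make(params))
  end
  return net
end

-- 'proposal' stands in for whatever the auto-tuner's suggestion would provide.
local proposal = { hidden = 128 }
local net = build(spec, proposal)
```

With this shape, only the closures are defined up front, and `build` runs once per parameter set the tuner proposes, so layers are never initialized and then thrown away.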