lmjohns3 / theanets

Neural network toolkit for Python
http://theanets.rtfd.org
MIT License

Changed optimize args to be passed as a list, rather than a string split ... #2

Closed kastnerkyle closed 10 years ago

kastnerkyle commented 11 years ago

...by '+'

kastnerkyle commented 11 years ago

If we change the type of --optimize to list, the existing way of passing just "sgd" or just "hf" will be parsed as ["s", "g", "d"] or ["h", "f"]. I am thinking it would be better to leave the type off of --optimize and check the type with isinstance, but that way the command-line approach will only be able to run a single trainer, though.
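
For illustration, a minimal sketch of that behavior (the flag name is from this thread; everything else is made up):

```python
import argparse

# With type=list, argparse applies list() to the raw string, so a single
# trainer name gets split into its individual characters.
parser = argparse.ArgumentParser()
parser.add_argument('--optimize', type=list)
print(parser.parse_args(['--optimize', 'sgd']).optimize)  # ['s', 'g', 'd']
```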

Maybe it is better to leave it as "+" and just live with that on the programmatic side? I am not seeing a great way to unify the two techniques without losing some existing capability or changing the command-line API.

I agree about removing the call to self.args.trainer_args if we go this route.

lmjohns3 commented 11 years ago

Oh, sorry, I just meant add nargs='+' to the command-line argument.
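
Roughly like this, in other words (the flag name is from this thread; the default value is only illustrative):

```python
import argparse

# nargs='+' collects one or more whitespace-separated values into a list, so
# both the single-trainer and multi-trainer forms parse naturally.
parser = argparse.ArgumentParser()
parser.add_argument('--optimize', nargs='+', default=['sgd'])
print(parser.parse_args(['--optimize', 'hf']).optimize)         # ['hf']
print(parser.parse_args(['--optimize', 'sgd', 'hf']).optimize)  # ['sgd', 'hf']
```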

kastnerkyle commented 11 years ago

Ah, that makes more sense to me - I didn't think of that. I will work on it.

lmjohns3 commented 11 years ago

I just checked in a change that I think will fix up this issue -- it makes the Experiment have a list of trainers, and adds a method (add_trainer) that can be used to programmatically add new trainers (with their own kwargs). (I also made the --optimize flag take a list of arguments on the command line.)
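
A rough sketch of the intended programmatic use; the network class, constructor arguments, and exact add_trainer signature below are illustrative guesses and not verified against the commit:

```python
import theanets

# Rough sketch of the API described above: the Experiment keeps a list of
# trainers, and add_trainer queues one with its own keyword arguments.
# The network class and constructor arguments here are illustrative only.
exp = theanets.Experiment(theanets.feedforward.Regressor, layers=(10, 20, 3))
exp.add_trainer('sgd', learning_rate=0.01)  # kwargs specific to this trainer
exp.add_trainer('hf')                       # a second trainer, run after sgd
```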

Let me know if this will solve the problem!

kastnerkyle commented 11 years ago

I will check it out this evening, but it looks pretty good just from eyeballing it. Awesome.

kastnerkyle commented 10 years ago

I think your fix solved the problem - I will go ahead and close this out. Thanks!