emanjavacas / pie

A fully-fledged PyTorch package for Morphological Analysis, tailored to morphologically rich and historical languages.
MIT License
22 stars 10 forks

Feature/optuna #67

Open PonteIneptique opened 4 years ago

PonteIneptique commented 4 years ago

It's a work in progress for now. I'll need to implement multi-GPU support, but I don't have access to such a setup right now.

PonteIneptique commented 4 years ago

Rather than issue reports, I'm looking for feedback on my implementation :) There is still some stuff I have to do, but this looks promising.

emanjavacas commented 4 years ago

Hey. Nice you got it wrapped up. I have two comments on this.

PonteIneptique commented 4 years ago

Regarding 1., I agree, but I am unsure about the way forward. Technically, I reused as much as I could from the Trainer class; most of the duplicated code comes directly from the train script. Maybe the Trainer class could have a setup(settings) method though, which would further reduce the duplication.

As for 2., I technically deal with nested settings using a path notation ("lr/patience"). I did not include an example though; I'll look at your implementation for this.
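
A minimal sketch of that kind of path-based lookup, assuming the settings are a plain nested dict; the helper name is illustrative, not pie's actual API:

from functools import reduce
from typing import Any, Dict

def resolve_path(settings: Dict[str, Any], path: str) -> Any:
    # Resolve a slash-separated path such as "lr/patience" inside a nested dict.
    return reduce(lambda node, key: node[key], path.split("/"), settings)

settings = {"lr": {"patience": 3, "factor": 0.5}}
assert resolve_path(settings, "lr/patience") == 3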

PonteIneptique commented 4 years ago

I think something along the following lines would be neat:

trainer, trainset, devset, encoder, models = Trainer.setup(settings)
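
A minimal sketch of what such a factory classmethod could look like. The classes below are placeholders that only illustrate the pattern implied by the one-liner above; they are not pie's actual Trainer, datasets, encoder, or models:

from typing import Any, Dict, List, Tuple

class Dataset:
    """Placeholder standing in for pie's dataset reader."""
    def __init__(self, path: str):
        self.path = path

class Encoder:
    """Placeholder standing in for the label/input encoder."""
    def __init__(self, settings: Dict[str, Any]):
        self.settings = settings

class Model:
    """Placeholder standing in for a pie model."""
    def __init__(self, encoder: Encoder):
        self.encoder = encoder

class Trainer:
    def __init__(self, settings: Dict[str, Any], trainset: Dataset,
                 devset: Dataset, models: List[Model]):
        self.settings, self.trainset, self.devset, self.models = settings, trainset, devset, models

    @classmethod
    def setup(cls, settings: Dict[str, Any]) -> Tuple["Trainer", Dataset, Dataset, Encoder, List[Model]]:
        # One place that does all the wiring, so the train and tune scripts
        # do not have to duplicate it.
        trainset = Dataset(settings["input_path"])
        devset = Dataset(settings["dev_path"])
        encoder = Encoder(settings)
        models = [Model(encoder)]
        return cls(settings, trainset, devset, models), trainset, devset, encoder, models

trainer, trainset, devset, encoder, models = Trainer.setup(
    {"input_path": "train.tsv", "dev_path": "dev.tsv"})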
PonteIneptique commented 4 years ago

I reworked this towards a more unified training API. If you agree with it, I'll apply it to the other scripts.

emanjavacas commented 4 years ago

Actually, it's better to use "devices", because it wouldn't have to be a CUDA device after all.
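
A small illustration of the point, assuming torch is installed; the variable names are just for the example:

import torch

# A "devices" list can legitimately contain non-CUDA entries such as the CPU,
# which is why "gpus"/"cudas" would be a misleading parameter name.
devices = ["cuda:0", "cuda:1"] if torch.cuda.is_available() else ["cpu"]
torch_devices = [torch.device(name) for name in devices]
print(torch_devices)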

On Sat, Jun 27, 2020 at 5:44 PM Thibault Clérice notifications@github.com wrote:

@PonteIneptique commented on this pull request.

In pie/scripts/tune.py https://github.com/emanjavacas/pie/pull/67#discussion_r446539022:

+def create_tuna_optimization(trial: optuna.Trial, fn: str, name: str, value: List[Any]):
+    """ Generate tuna value generator
+    Might use self one day, so...
+    :param trial:
+    :param fn:
+    :param name:
+    :param value:
+    :return:
+    """
+    return getattr(trial, fn)(name, *value)
+
+
+class Optimizer(object):
+
+    def __init__(self, settings, optimization_settings: List[Dict[str, Any]], gpus: List[int] = []):

I can definitely use cudas (plural).
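
For context, the helper quoted above just forwards to one of optuna's trial.suggest_* methods. A minimal, self-contained sketch of how it might be driven; the parameter names and ranges here are assumptions, not the settings pie actually tunes:

import optuna
from typing import Any, List

def create_tuna_optimization(trial: optuna.Trial, fn: str, name: str, value: List[Any]):
    # Same pattern as the quoted helper: "fn" names a trial.suggest_* method,
    # "value" holds its positional arguments.
    return getattr(trial, fn)(name, *value)

def objective(trial: optuna.Trial) -> float:
    lr = create_tuna_optimization(trial, "suggest_float", "lr", [1e-5, 1e-2])
    cell = create_tuna_optimization(trial, "suggest_categorical", "cell", [["GRU", "LSTM"]])
    # A real objective would train a model with these values and return its dev score.
    return lr if cell == "LSTM" else lr / 2

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=5)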



PonteIneptique commented 4 years ago

Thanks for the review ;)

PonteIneptique commented 4 years ago

OK, this is ready for another look from you. I addressed all your concerns (I think). This should be neat :)

PonteIneptique commented 4 years ago

All in all, I think this one is ready.