garrowat / zenobot

AI proverb generator, trained on 2,000-10,000 proverbs using a fastai/PyTorch LSTM model, converted to run on CPU for inference.
https://zenobot.garrettwatson.io
Apache License 2.0

[Question] within zeno.ipynb, how did you arrive at optimal hyper parameters #1

Closed abhishekms1047 closed 5 years ago

abhishekms1047 commented 5 years ago

I saw that you are passing these hyperparameters to your neural networks. Can you please explain how you arrived at them, and how you know they are optimal?

TEXT = data.Field(lower=True, tokenize=list)
bs=64; bptt=8; n_fac=42; n_hidden=128

TEXT3 = data.Field(lower=True, tokenize=list)
bs=64; bptt=8; n_fac=42; n_hidden2=512
  1. Did you use any methodology, such as Bayesian optimization, grid search, or genetic algorithms?
garrowat commented 5 years ago

Hi, sure! I'm quite a novice, so I'm afraid I can't provide much in the way of methodology.
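
For readers curious about the grid-search approach mentioned in the question, here is a minimal sketch. The `validation_loss` function is a hypothetical stand-in: in a real run it would train the fastai LSTM with the given `n_fac` and `n_hidden` and return the validation loss, which is expensive; the toy surrogate below just has its minimum at the repo's actual values (n_fac=42, n_hidden=128) for illustration.

```python
from itertools import product

def validation_loss(n_fac, n_hidden):
    """Hypothetical stand-in for train-then-validate at these settings.

    A real implementation would build and train the LSTM, then return
    the loss on a held-out validation set. This toy surrogate simply
    dips toward n_fac=42, n_hidden=128.
    """
    return (n_fac - 42) ** 2 / 100 + (n_hidden - 128) ** 2 / 10000

def grid_search(n_facs, n_hiddens):
    """Exhaustively score every (n_fac, n_hidden) pair; return the best."""
    best = None
    for n_fac, n_hidden in product(n_facs, n_hiddens):
        loss = validation_loss(n_fac, n_hidden)
        if best is None or loss < best[0]:
            best = (loss, n_fac, n_hidden)
    return best

best_loss, best_fac, best_hidden = grid_search([21, 42, 84], [64, 128, 512])
print(best_fac, best_hidden)  # -> 42 128
```

Grid search is the simplest of the methods listed; Bayesian optimization (e.g. via a library such as Optuna) is usually preferred when each evaluation means a full training run, since it needs far fewer trials.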