EnsemblGSOC / Ensembl-Repeat-Identification

A deep learning repository for predicting the location and type of repeat sequences in a genome.

organize hyperparameters #35

Closed. williamstark01 closed this issue 1 year ago.

williamstark01 commented 2 years ago

Converting the tunable variables of the network into hyperparameters will help us fine-tune their values more easily.

Using an experiment configuration file is an option here, as is adding all hyperparameters as arguments to the training script.

Take a look at how we do the former in another project:

https://github.com/Ensembl/gene_pcp/blob/main/transformer_pipeline.py#L411
https://github.com/Ensembl/gene_pcp/blob/main/transformer_configuration.yaml
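To make the thread self-contained, here is a minimal sketch combining both options: hyperparameters exposed as training script arguments, optionally overridden by a YAML experiment configuration file. The argument names, configuration keys, and default values below are illustrative, not the actual ones from gene_pcp or this repository.

    import argparse

    import yaml


    def parse_hyperparameters():
        # hypothetical argument names, for illustration only
        parser = argparse.ArgumentParser()
        parser.add_argument(
            "--configuration", help="experiment configuration YAML file path"
        )
        parser.add_argument("--dropout", type=float, default=0.1)
        parser.add_argument("--learning_rate", type=float, default=1e-4)
        args = parser.parse_args()

        # start from the argparse defaults / command line values
        hyperparameters = vars(args)
        if args.configuration:
            # values in the configuration file take precedence
            with open(args.configuration) as file:
                hyperparameters.update(yaml.safe_load(file))
        return hyperparameters

With this layout either source works on its own, and a configuration file only needs to list the values being tuned in a given experiment.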

yangtcai commented 2 years ago

Yes, I will try it!!! :D

williamstark01 commented 2 years ago

Some more Transformer parameters can be converted into hyperparameters to speed up experimenting with different values (a wiring sketch follows the list):

        d_model,
        nhead=8,
        num_encoder_layers=6,
        num_decoder_layers=6,
        dim_feedforward=2048,
        dropout=0.1,
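For illustration, assuming the model instantiates torch.nn.Transformer directly, these could be wired through a single configuration dictionary. The values shown are the nn.Transformer defaults quoted above (d_model defaults to 512); this is a sketch, not the repository's actual model code.

    import torch.nn as nn

    # example hyperparameter values, matching the nn.Transformer defaults
    configuration = {
        "d_model": 512,
        "nhead": 8,
        "num_encoder_layers": 6,
        "num_decoder_layers": 6,
        "dim_feedforward": 2048,
        "dropout": 0.1,
    }

    # the keys match nn.Transformer's keyword arguments, so the whole
    # dictionary can be unpacked directly into the constructor
    transformer = nn.Transformer(**configuration)

Keeping the keys identical to the constructor's keyword arguments means a loaded configuration file can be passed straight through without any translation layer.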
williamstark01 commented 1 year ago

Done in #41.