MaxVanDijck / pytorch-ignite-template

An extensible Deep Learning template for research and production

Thanks for creating this template! #1

Open · vfdev-5 opened this issue 2 years ago

vfdev-5 commented 2 years ago

@MaxVanDijck thanks for creating this template!

I wonder if it would also be helpful to integrate it into https://code-generator.pytorch-ignite.ai/ (source code)?

MaxVanDijck commented 2 years ago

Do you mean implementing the existing templates (e.g. template-text-classification) with this template? I imagine this is quite a big task, as it most likely requires frontend work as well. Keep in mind that this template is also a work in progress that I would like to continue developing, so it might become difficult to maintain both.

I'm happy for this to happen, or we could take some of the ideas in this template (such as the configuration or project structure) and add them to code-generator. Did you have any thoughts on how you see this panning out?

vfdev-5 commented 2 years ago

> Keep in mind that this template is also a work in progress that I would like to continue developing, so it might become difficult to maintain both.

That's a good point! Maybe we could either do it later, or add a version of it now and sync it later.

> Do you mean implementing the existing templates

I was thinking about adding the template as it is. It is configured for cifar10, so it could be named template-hydra-cifar10. Template configurations (in code-generator: tabs with checkboxes, etc.) could be a bit tricky to figure out, but we could try to keep it minimal in the beginning.

> I'm happy for this to happen, or we could take some of the ideas in this template (such as the configuration or project structure) and add them to code-generator. Did you have any thoughts on how you see this panning out?

We wanted to add a template with other config managers (e.g. Hydra), so your template is perfect in this sense.

@ydcjeff any ideas/thoughts on how we could move this forward?

EDIT:

@MaxVanDijck a few questions on the structure.

I was not expecting to see the engine and model so tightly tied to the dataset.

https://github.com/Lune-AI/pytorch-ignite-template/blob/7a120434ee84ecd965e8c6fedcd7eddb8668f003/config/train.yaml#L18-L19

The engine can be rather generic, with a classification train_step. As for the model, I agree that cifar10 requires the last layer to output 10 floats; however, this could perhaps be made configurable in code by taking a parameter from the data config, such as num_classes, and constructing the model with that in mind. What do you think?
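A minimal sketch of one way this could look, assuming OmegaConf/Hydra interpolation and purely illustrative config keys (not the template's actual layout):

```python
from omegaconf import OmegaConf

# Hypothetical config: the model reads num_classes from the data group via
# interpolation, so only the data config knows about cifar10 specifics.
cfg = OmegaConf.create({
    "data": {"num_classes": 10},  # cifar10
    "model": {
        "_target_": "src.models.cnn.CNN",
        "in_channels": 3,
        "num_classes": "${data.num_classes}",
    },
})

# The interpolation resolves against the root config, so the model config
# itself stays dataset-agnostic.
print(OmegaConf.to_yaml(cfg.model, resolve=True))
```

The resolved model config could then be passed to hydra.utils.instantiate, so swapping datasets would only touch the data config.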

MaxVanDijck commented 2 years ago

> The engine can be rather generic, with a classification train_step. As for the model, I agree that cifar10 requires the last layer to output 10 floats; however, this could perhaps be made configurable in code by taking a parameter from the data config, such as num_classes, and constructing the model with that in mind. What do you think?

Yep, this is another aspect I intend to make clearer going forward with a bit more documentation. The `- model: cifar10.yaml` entry maps to this config file, which is used in a hydra.utils.instantiate call. The number of classes or channels could be added to the YAML file like:

```yaml
_target_: src.models.cnn.CNN
in_channels: 1
num_classes: 100
```

See the Hydra documentation for details.
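For illustration, a rough sketch of how such a config entry ends up as a model object via hydra.utils.instantiate; the config_path/config_name values below are assumptions rather than the template's confirmed layout:

```python
import hydra
from hydra.utils import instantiate
from omegaconf import DictConfig

@hydra.main(config_path="config", config_name="train")
def main(cfg: DictConfig) -> None:
    # cfg.model holds the _target_ block above; instantiate() then calls
    # src.models.cnn.CNN(in_channels=..., num_classes=...)
    model = instantiate(cfg.model)
    print(model)

if __name__ == "__main__":
    main()
```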

Perhaps renaming cifar10.yaml to cnn.yaml would make it clearer. Ideally, if a user wants to add a new model, they can create another .py file in src/models with their nn.Module implementation and then create a .yaml file for the model inside config/model. The same can be done for different datasets and engines if desired. I'm always open to suggestions on how to handle this better 😄
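As a hypothetical illustration of that workflow (the MLP model and file names below are made up, not files in the template):

```python
# src/models/mlp.py  (hypothetical new model added by a user)
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Small fully connected classifier; constructor args mirror the YAML config."""

    def __init__(self, in_features: int = 3 * 32 * 32, num_classes: int = 10):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(in_features, 256),
            nn.ReLU(),
            nn.Linear(256, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)
```

The matching config/model/mlp.yaml would then just point _target_ at src.models.mlp.MLP and list in_features and num_classes, so selecting the new model becomes a one-line change in the config.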

vfdev-5 commented 2 years ago

Yes, this makes perfect sense. My confusion was essentially about the naming.