tristandeleu / pytorch-meta

A collection of extensions and data-loaders for few-shot learning & meta-learning in PyTorch
https://tristandeleu.github.io/pytorch-meta/
MIT License

Documentation incomplete #68

Open · renesax14 opened this issue 4 years ago

renesax14 commented 4 years ago

There are many classes, especially in the data loaders, that have no docs. This makes it very difficult to use the library.

For example, when creating a meta-set one usually has a number of datasets. How many datasets are actually created? Usually, in one episode one creates one dataset. Is this how your library does it, or does it work differently?

Some docs so that it's easier to use would be good.

Comments on episodic learning would be good. Usually I sample a set of meta-tasks (or data-sets from the meta-set) per episode.

tristandeleu commented 4 years ago

Documentation is definitely incomplete, and PRs to add more documentation are welcome!

For example, when creating a meta-set one usually has a number of datasets. How many datasets are actually created? Usually, in one episode one creates one dataset. Is this how your library does it, or does it work differently?

In CombinationMetaDataset (the most frequently used MetaDataset), the number of datasets created corresponds to C(number of classes in the dataset, number of ways): each dataset is created by first picking `number of ways` classes from the pool of possible classes in the dataset, and then sampling images from those classes.
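To make this concrete, here is a rough sketch using the Omniglot helper from the README (which builds a CombinationMetaDataset under the hood). The `len()` call and the tuple indexing below are a sketch of the idea described above rather than guaranteed API, so treat them as assumptions:

```python
from torchmeta.datasets.helpers import omniglot

# The omniglot helper (from the README) wraps Omniglot in a
# CombinationMetaDataset with `ways` classes per task.
dataset = omniglot("data", ways=5, shots=1, meta_train=True, download=True)

# Each index of the meta-dataset is one combination of 5 classes, so the
# total number of tasks is C(number of classes in the meta-train split, 5).
print(len(dataset))

# Indexing with a tuple of 5 class indices gives the dataset for that task:
# images are sampled only from these 5 classes (assumed indexing scheme).
task = dataset[(0, 1, 2, 3, 4)]
print(task)
```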

renesax14 commented 4 years ago

Documentation is definitely incomplete, and PRs to add more documentation are welcome!

For example, when creating a meta-set one usually has a number of datasets. How many datasets are actually created? Usually, in one episode one creates one dataset. Is this how your library does it, or does it work differently?

In CombinationMetaDataset (the most frequently used MetaDataset), the number of datasets created corresponds to C(number of classes in the dataset, number of ways): each dataset is created by first picking `number of ways` classes from the pool of possible classes in the dataset, and then sampling images from those classes.

Awesome, I will make sure to do PRs as I discover more uses for your stuff. :)

Let me ask my question this way. Is sampling a batch of tasks (say 16) the usual definition of an episode?

A comment on the term episodic learning would be useful to have somewhere in the docs. Perhaps a small section defining terms in meta-learning would be helpful.

Some I can think of are: episode, meta-task, and meta-set.

I hope I don't sound demanding; your library is already pretty awesome and it's highly appreciated :) ;)

Hope I can help to make it better :D

tristandeleu commented 4 years ago

Any contribution is welcome, thank you for your help! Having a general definition for these terms would be great indeed, and the docs could use some tutorials to introduce some of these.

Let me ask my question this way. Is sampling a batch of tasks (say 16) the usual definition of an episode?

The way I see it, an episode means the dataset of one single task. Having a batch of tasks is really only there to reduce the variance of the gradient estimate during the outer-loop optimization. When doing episodic learning (the inner loop), you only get to see data from the one task you are trying to solve; you can't use information from other tasks.
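For reference, this is roughly what that looks like with the Omniglot example from the README: BatchMetaDataLoader gives you a batch of 16 independent tasks, and each slice along the first dimension is one episode. The shapes in the comments follow the README example; the inner-/outer-loop comments just restate the explanation above:

```python
from torchmeta.datasets.helpers import omniglot
from torchmeta.utils.data import BatchMetaDataLoader

# 5-way, 5-shot Omniglot with 15 test examples per class, as in the README.
dataset = omniglot("data", ways=5, shots=5, test_shots=15,
                   meta_train=True, download=True)
dataloader = BatchMetaDataLoader(dataset, batch_size=16, num_workers=4)

for batch in dataloader:
    # One batch = 16 independent tasks sampled from the meta-dataset.
    train_inputs, train_targets = batch["train"]  # (16, 25, 1, 28, 28)
    test_inputs, test_targets = batch["test"]     # (16, 75, 1, 28, 28)

    for task_idx in range(train_inputs.size(0)):
        # One episode: inner-loop adaptation only sees this task's data.
        support = (train_inputs[task_idx], train_targets[task_idx])
        query = (test_inputs[task_idx], test_targets[task_idx])
        # ... adapt on `support`, evaluate on `query`; the 16 per-task
        # losses are averaged for the outer-loop update (variance reduction).
    break
```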