tristandeleu / pytorch-meta

A collection of extensions and data-loaders for few-shot learning & meta-learning in PyTorch
https://tristandeleu.github.io/pytorch-meta/
MIT License

Addition of OmniPrint dataset #152

Open RobvanGastel opened 2 years ago

RobvanGastel commented 2 years ago

I added a dataloader for the OmniPrint dataset, modeled on the Omniglot dataset. Does this conform sufficiently to the pytorch-meta code structure? Currently, I host the dataset on my personal Google Drive to avoid adding an extra dependency on Kaggle, where the dataset is officially hosted. Following the OmniPrint source code, the training split is identical across all print splits (meta1, meta2, ...).
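To illustrate the split layout described above, here is a minimal sketch of the invariant the loader relies on: every print split shares the same meta-train class list. The class names and the exact number of print splits here are illustrative assumptions, not the actual OmniPrint contents:

```python
# Hypothetical sketch of the OmniPrint split layout: each print split
# (meta1, meta2, ...) reuses the same meta-train class list, as in the
# OmniPrint source code. Class names below are made up for illustration.
PRINT_SPLITS = ["meta1", "meta2", "meta3"]
TRAIN_CLASSES = ["latin_a", "latin_b", "cyrillic_a"]  # hypothetical

# Build one split table per print split; only the rendering differs
# between print splits, not the class partition.
splits = {name: {"train": list(TRAIN_CLASSES)} for name in PRINT_SPLITS}

# The invariant a dataloader can assume: identical train classes everywhere.
assert all(
    splits[name]["train"] == splits["meta1"]["train"] for name in PRINT_SPLITS
)
```

A loader built on this assumption can cache the train class index once and reuse it for every print split.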