MIT License

Optimized Masked Autoencoders (MAEs)

A lean, optimized implementation of masked autoencoders (MAEs). The code skeleton is adapted from Facebook's MAE repository, with various simplifications. The following optimizations are implemented:

This implementation also removes the model definitions' dependence on the timm library, so the code is self-contained apart from the standard libraries. The code was tested with pytorch==2.2.0 and torchvision==0.17.0.
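To illustrate what dropping the timm dependency looks like in practice, here is a minimal sketch (not the repository's actual code) of two MAE building blocks written in plain PyTorch: a Conv2d-based patch embedding, and the per-sample random masking scheme from the original MAE paper. All names (`PatchEmbed`, `random_masking`) and default hyperparameters are illustrative assumptions.

```python
# Minimal, self-contained sketch of MAE-style components without timm.
# All class/function names and defaults here are illustrative, not the repo's API.
import torch
import torch.nn as nn

class PatchEmbed(nn.Module):
    """Image-to-patch embedding via a strided convolution (no timm needed)."""
    def __init__(self, img_size=224, patch_size=16, in_chans=3, embed_dim=768):
        super().__init__()
        self.num_patches = (img_size // patch_size) ** 2
        self.proj = nn.Conv2d(in_chans, embed_dim,
                              kernel_size=patch_size, stride=patch_size)

    def forward(self, x):
        # (B, C, H, W) -> (B, num_patches, embed_dim)
        return self.proj(x).flatten(2).transpose(1, 2)

def random_masking(x, mask_ratio=0.75):
    """Keep a random subset of patches per sample (MAE-style masking)."""
    B, L, D = x.shape
    len_keep = int(L * (1 - mask_ratio))
    noise = torch.rand(B, L)                   # one random score per patch
    ids_shuffle = torch.argsort(noise, dim=1)  # ascending: lowest scores kept
    ids_keep = ids_shuffle[:, :len_keep]
    x_kept = torch.gather(x, 1, ids_keep.unsqueeze(-1).expand(-1, -1, D))
    mask = torch.ones(B, L)                    # binary mask: 0 = kept, 1 = masked
    mask.scatter_(1, ids_keep, 0.0)
    return x_kept, mask

embed = PatchEmbed()
imgs = torch.randn(2, 3, 224, 224)
tokens = embed(imgs)                 # (2, 196, 768): 14x14 patches
kept, mask = random_masking(tokens)  # (2, 49, 768): 25% of patches kept
```

With a 75% mask ratio, the encoder only ever sees the 49 kept tokens per image, which is the main source of MAE's pre-training efficiency.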

Notes: