facebookresearch/mae
PyTorch implementation of MAE (https://arxiv.org/abs/2111.06377)
6.93k stars, 1.17k forks
Issues
#199  Bug in `random_masking`?  (schmidt-ai, closed 4 weeks ago, 0 comments)
#198  Mismatch  (ats4869, opened 1 month ago, 0 comments)
#197  Reconstruction using normalized pixel values to get unnormalized pixel values?  (Aakash3101, opened 2 months ago, 0 comments)
#196  The training code fails to run with the latest timm  (kevin-Abbring, closed 2 months ago, 0 comments)
#195  Can the interactive visualization demo run on a GPU?  (HaoqianSong, opened 3 months ago, 0 comments)
#194  How to obtain the complete reconstructed image?  (cestbonsuliu, opened 3 months ago, 0 comments)
#193  Colab notebook error  (barbara42, opened 3 months ago, 2 comments)
#192  Code: Compatible with any number of channels for the patchify and unpatchify functions  (zhongruiHuangDMRI, closed 4 months ago, 2 comments)
#191  Two different checkpoints for each ViT type  (hussein-jafarinia, closed 2 months ago, 5 comments)
#190  Is the training procedure result normal? Masked regions do not improve and appear to be random noise.  (junzhin, opened 4 months ago, 2 comments)
#189  Test  (jaeofbum, opened 4 months ago, 1 comment)
#188  Could you provide the pretrained checkpoints of both the encoder and decoder in MAE?  (tangky22, closed 3 months ago, 2 comments)
#187  Visualizing the attention map  (kimsekeun, opened 5 months ago, 0 comments)
#186  model.fc_norm is not trained in linear probing  (EmreTaha, closed 4 months ago, 0 comments)
#185  How to obtain the reconstructed and masked images for inference  (hzxie99, opened 6 months ago, 0 comments)
#184  Both LLaMA and MAE use a smaller beta2 in the AdamW optimizer during pre-training. Is there any intuition behind this setting?  (Novestars, opened 7 months ago, 1 comment)
#183  init MAE for DogClassify  (MakoOfficial, closed 7 months ago, 1 comment)
#182  patchify and unpatchify  (tingyushi, opened 8 months ago, 1 comment)
#181  About the GAN loss  (cannonli7, opened 8 months ago, 2 comments)
#180  Error in loading pretrained weights for 'mae_vit_base_patch16'  (nightrain-vampire, opened 8 months ago, 2 comments)
#179  Loss is considerably worse on a custom dataset with a different mean and standard deviation  (bpmsilva, opened 9 months ago, 3 comments)
#178  Feat/disable distributed  (whikwon, closed 9 months ago, 1 comment)
#177  param_groups_lrd for layer decay  (1119736939, opened 9 months ago, 1 comment)
#176  Confusion in the loss function implementation  (bibhabasumohapatra, opened 10 months ago, 0 comments)
#175  Request for segmentation fine-tuning code  (LZhangMorilab, opened 11 months ago, 0 comments)
#174  Is the visualization result normal?  (WangYZ1608, opened 11 months ago, 2 comments)
#173  Small naming error - masking generation  (lilygeorgescu, opened 11 months ago, 1 comment)
#172  Not able to import inf from torch._six  (tauruswcc, closed 11 months ago, 4 comments)
#171  Creative Commons does not recommend their licenses be used for software  (rbavery, opened 11 months ago, 0 comments)
#170  How to reconstruct some unlabeled images  (young169, opened 11 months ago, 1 comment)
#169  Dataset for "Fine-Tuning Pre-trained MAE for classification"  (royleung01, opened 11 months ago, 0 comments)
#168  Poor image reconstruction after fine-tuning on the MVTec dataset  (cestbonsuliu, closed 3 months ago, 2 comments)
#167  Is it possible to enable FP16 or TF32 in pretraining?  (Wongboo, opened 1 year ago, 1 comment)
#166  A question about DropPath in pretraining  (YangSun22, opened 1 year ago, 1 comment)
#165  Monitoring training on a custom dataset  (DanielShalam, opened 1 year ago, 0 comments)
#164  Training time  (penguin1109, opened 1 year ago, 1 comment)
#163  Question regarding Figure 5 in the paper  (ajboloor, opened 1 year ago, 0 comments)
#162  Making MAE efficient  (NeeluMadan, closed 1 year ago, 1 comment)
#161  fix: Prevent multiple print redefinitions  (lsc64, opened 1 year ago, 2 comments)
#160  AssertionError: Input image height (736) doesn't match model (224).  (QY1994-0919, opened 1 year ago, 1 comment)
#159  Single-machine multi-GPU training  (AlexNmSED, opened 1 year ago, 2 comments)
#158  License questions  (CA4GitHub, opened 1 year ago, 1 comment)
#157  Question about `len_keep` in the random masking function  (comojin1994, opened 1 year ago, 1 comment)
#156  The reconstruction from mae_pretrain_vit_base.pth is awful; is that expected?  (zheng547, closed 1 year ago, 1 comment)
#155  Release of the MAE decoder  (ustcwhy, closed 1 year ago, 2 comments)
#154  Does the MAE pre-trained model transfer well to CIFAR?  (ZK-Zhou, opened 1 year ago, 1 comment)
#153  [Question] Reconstructed pixels are discontinuous at patch boundaries.  (LeroyChou, opened 1 year ago, 9 comments)
#152  Question about PatchEmbed's initialization trick  (tae-mo, opened 1 year ago, 1 comment)
#151  Non-square number of patches  (dvd42, opened 1 year ago, 0 comments)
#150  Implementing ViT-Small in MAE  (bryanwong17, opened 1 year ago, 1 comment)