My GPU is an A6000 (48 GB) and my RAM is 128 GB, but after waiting a long time the result is:
Using Transformer Model: FAST_Transformer_joint Baseline Quant Single !!
Using Single Quantizer !!!! Reduce 4 High Reso 256 Relu Quantize 256 Symm 1
FAST_transformer_builder_baseline No drop Quant single cond class
z emb shape torch.Size([8192, 3072])
Loading Stage2 Fail !!!!!!
Current best validation metric (iou): -inf
Total number of parameters: 3670362821
output path: output/PR256_ED512_EN8192/class-guide/transformer3072_24_32
100%|███████████████████████████████████████| 1024/1024 [32:10<00:00, 1.88s/it]
100%|███████████████████████████████████████| 1024/1024 [32:04<00:00, 1.88s/it]
I also get the cube. I want to know why my Stage 2 loading failed.
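While waiting for an answer, one way to narrow this down: the log only prints "Loading Stage2 Fail !!!!!!" without the underlying exception, which suggests the loader swallows the real error. A minimal sketch of a pre-check you could run yourself, assuming a hypothetical checkpoint path under the output directory shown in the log (the actual filename in the repo may differ):

```python
import os

def check_stage2_checkpoint(path: str) -> str:
    """Return a diagnostic message for a Stage 2 checkpoint path.

    Hypothetical helper: since the repo's loader only prints a generic
    failure message, checking the path yourself tells you whether the
    file is missing, empty, or present but failing to deserialize.
    """
    if not os.path.exists(path):
        return f"missing: {path}"
    size = os.path.getsize(path)
    if size == 0:
        return f"empty file: {path}"
    return f"found ({size} bytes): {path}"

# Assumed location based on the "output path" line in the log; adjust to
# wherever your run actually writes its Stage 2 weights.
print(check_stage2_checkpoint(
    "output/PR256_ED512_EN8192/class-guide/transformer3072_24_32/stage2.pth"))
```

If the file exists but still fails, loading it directly with `torch.load(path, map_location="cpu")` outside the training script will surface the real exception (e.g. a key mismatch or a corrupted download).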