Suzuki-Yonekura-Lab / VAEGAN-motor

Motor generation using VAEGAN. Joint research with Aisin.

Log #1

Open TamuraMasayuki opened 1 year ago

TamuraMasayuki commented 1 year ago

ToDo

TamuraMasayuki commented 1 year ago

Encoder loss stays stuck at a high value

TamuraMasayuki commented 1 year ago

Direction of the undergraduate thesis

TamuraMasayuki commented 1 year ago

ToDo: shape generation for multiple topologies

TamuraMasayuki commented 1 year ago

2023/5/22

TamuraMasayuki commented 1 year ago

2023/6/13

TamuraMasayuki commented 1 year ago

2023/06/14

TamuraMasayuki commented 1 year ago

2023/7/11

What I did

Trained with both torque and area included in the label (a conditioning sketch is shown below)
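For illustration, a minimal sketch of feeding a two-valued condition (torque, area) to the decoder together with the latent vector; `latent_dim`, the scaling constants, and the `decoder` call are assumptions for the example, not code from this repository.

```python
import torch

latent_dim = 100                      # assumed latent size for the example
torque, area = 2.5, 1200.0            # raw label values in dataset units

# Scale both quantities to a comparable range before using them as a label.
label = torch.tensor([[torque / 10.0, area / 2000.0]])

z = torch.randn(1, latent_dim)                  # latent sample
decoder_input = torch.cat([z, label], dim=1)    # shape: (1, latent_dim + 2)
# coords = decoder(decoder_input)  # decoder's first Linear must accept latent_dim + 2 features
```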

TamuraMasayuki commented 1 year ago

von Mises-Fisher distribution
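For reference, the density of the von Mises-Fisher distribution on the unit sphere $S^{p-1}$ (used, for example, as the latent distribution in hyperspherical VAEs) is

$$
f_p(\mathbf{x};\boldsymbol{\mu},\kappa)=C_p(\kappa)\exp\!\left(\kappa\,\boldsymbol{\mu}^{\top}\mathbf{x}\right),\qquad
C_p(\kappa)=\frac{\kappa^{p/2-1}}{(2\pi)^{p/2}\,I_{p/2-1}(\kappa)},
$$

where $\lVert\boldsymbol{\mu}\rVert=1$ is the mean direction, $\kappa\ge 0$ the concentration, and $I_\nu$ the modified Bessel function of the first kind; $\kappa=0$ gives the uniform distribution on the sphere.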

TamuraMasayuki commented 1 year ago

Reliability evaluation

Questions

TamuraMasayuki commented 1 year ago

Online learning

TamuraMasayuki commented 1 year ago

2023/8/1

TamuraMasayuki commented 1 year ago

Loss curves (discriminator / encoder / decoder)
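For context, a minimal sketch of how the three loss terms of a generic VAE-GAN are often formed; the function names, MSE reconstruction term, and BCE adversarial terms are assumptions here, not the training code used in this repository.

```python
import torch
import torch.nn.functional as F

def vaegan_losses(x, encoder, decoder, discriminator):
    """Generic VAE-GAN loss terms (sketch; weighting and details are assumptions)."""
    mu, logvar = encoder(x)                                   # assumed encoder output
    z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)   # reparameterization trick
    x_rec = decoder(z)

    real_logit = discriminator(x)
    fake_logit = discriminator(x_rec.detach())
    ones, zeros = torch.ones_like(real_logit), torch.zeros_like(real_logit)

    # Discriminator: classify real samples as 1 and reconstructions as 0.
    d_loss = (F.binary_cross_entropy_with_logits(real_logit, ones)
              + F.binary_cross_entropy_with_logits(fake_logit, zeros))

    # Encoder: reconstruction error plus KL divergence to the unit-Gaussian prior.
    rec_loss = F.mse_loss(x_rec, x)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    e_loss = rec_loss + kl

    # Decoder: reconstruct well and fool the discriminator.
    g_loss = rec_loss + F.binary_cross_entropy_with_logits(discriminator(x_rec), ones)
    return d_loss, e_loss, g_loss
```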

TamuraMasayuki commented 1 year ago

2023/8/1

TamuraMasayuki commented 1 year ago

Handling the error that appeared after adding dropout

        # latent_dim + class_num: latent vector concatenated with the one-hot class label
        self.model = nn.Sequential(
            *block(latent_dim+CLASS_NUM, 64, normalize=False),
            *block(64, 128, dropout=0.2),
            *block(128, 256, dropout=0.2),
            *block(256, 512, dropout=0.2),
            *block(512, 1024, dropout=0.2),
            nn.Linear(1024, coord_size),
            nn.Tanh()
        )
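The snippet above relies on a `block` helper that is not shown. A typical shape for it, assumed here only to make the layer numbering in the error below easier to follow, is:

```python
import torch.nn as nn

def block(in_feat, out_feat, normalize=True, dropout=0.0):
    """Assumed helper: Linear (+ optional BatchNorm1d) + LeakyReLU (+ optional Dropout)."""
    layers = [nn.Linear(in_feat, out_feat)]
    if normalize:
        layers.append(nn.BatchNorm1d(out_feat))
    layers.append(nn.LeakyReLU(0.2, inplace=True))
    if dropout:
        layers.append(nn.Dropout(dropout))  # extra module shifts the indices of later layers
    return layers
```

If `block` looks roughly like this, passing `dropout=0.2` inserts one parameter-free `nn.Dropout` per hidden block, so every subsequent module in `nn.Sequential` gets a different index than it had when the checkpoint was saved.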

decoder.load_state_dict(torch.load(G_PATH, map_location=torch.device('cpu')))

RuntimeError: Error(s) in loading state_dict for Decoder:
    Missing key(s) in state_dict: "model.7.weight", "model.7.bias", "model.7.running_mean", "model.7.running_var", "model.10.weight", "model.10.bias", "model.11.running_mean", "model.11.running_var", "model.15.weight", "model.15.bias", "model.15.running_mean", "model.15.running_var", "model.18.weight", "model.18.bias". 
    Unexpected key(s) in state_dict: "model.5.weight", "model.5.bias", "model.6.running_mean", "model.6.running_var", "model.6.num_batches_tracked", "model.8.weight", "model.8.bias", "model.9.weight", "model.9.bias", "model.9.running_mean", "model.9.running_var", "model.9.num_batches_tracked", "model.12.weight", "model.12.bias", "model.12.running_mean", "model.12.running_var", "model.12.num_batches_tracked". 
    size mismatch for model.6.weight: copying a param with shape torch.Size([256]) from checkpoint, the shape in current model is torch.Size([256, 128]).
    size mismatch for model.11.weight: copying a param with shape torch.Size([1024, 512]) from checkpoint, the shape in current model is torch.Size([512]).
    size mismatch for model.11.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
    size mismatch for model.14.weight: copying a param with shape torch.Size([292, 1024]) from checkpoint, the shape in current model is torch.Size([1024, 512]).
    size mismatch for model.14.bias: copying a param with shape torch.Size([292]) from checkpoint, the shape in current model is torch.Size([1024]).
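This mismatch is exactly what inserting the dropout layers causes: `nn.Dropout` has no parameters, but it still occupies an index inside `nn.Sequential`, so checkpoint keys such as `model.5.weight` now point at differently numbered modules in the new model. One possible workaround, assuming the checkpoint and the current model contain the same tensors in the same order and only the indices shifted, is to remap the keys by position (a sketch, not code from the repository):

```python
import torch

old_sd = torch.load(G_PATH, map_location=torch.device('cpu'))  # checkpoint (old layer indices)
new_sd = decoder.state_dict()                                   # current model (shifted indices)

# Positional remapping is only safe if both hold the same tensors in the same order.
assert len(old_sd) == len(new_sd), "tensor counts differ; positional remapping is unsafe"

remapped = {}
for (old_key, tensor), new_key in zip(old_sd.items(), new_sd.keys()):
    assert new_sd[new_key].shape == tensor.shape, (old_key, new_key)
    remapped[new_key] = tensor

decoder.load_state_dict(remapped)
```

Alternatively, when only inference is needed, the decoder can be rebuilt without dropout so its architecture matches the checkpoint exactly.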
TamuraMasayuki commented 11 months ago

2023/10/28

Questions