ETSSmartRes / VAE-NILM

Non-Intrusive Load Monitoring based on VAE model
GNU General Public License v3.0

Poor Reproduction Results of the Paper #11

Open ZhuHouYi opened 3 months ago

ZhuHouYi commented 3 months ago

Dear Author,

I have reimplemented your VAE model using the Disaggregator class from nilmtk_contrib. The experimental hyperparameters are as follows: epochs = 10, batch size = 64, window size = 512, learning rate = 3e-4, optimizer = Adam, validation split = 0.15. The experiments used four months of UK-DALE data for training and two months for testing. The training loss is defined according to your code and includes the reconstruction loss and the KL divergence loss. The checkpoint strategy during training was also adopted from your code, using ModelCheckpoint(monitor="val_mean_absolute_error", mode="min", save_best_only=True) and CustomStopper(monitor="val_loss", mode="auto"). In the TensorFlow command-line logs I observed the losses decreasing normally (the KL loss approached 0 over time, while the Recon_loss decreased very little).
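For concreteness, here is a minimal sketch of the training setup described above. The model here is a trivial stand-in, not your VAE, and EarlyStopping stands in for your CustomStopper; the data arrays are random placeholders for the UK-DALE windows:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint

# Hyperparameters as listed above.
EPOCHS = 10
BATCH_SIZE = 64
WINDOW_SIZE = 512
LEARNING_RATE = 3e-4
VALIDATION_SPLIT = 0.15

# Stand-in model: the real architecture is the repository's VAE, which adds
# its reconstruction and KL losses inside the model itself.
model = tf.keras.Sequential([
    tf.keras.layers.Conv1D(16, 3, padding="same", activation="relu",
                           input_shape=(WINDOW_SIZE, 1)),
    tf.keras.layers.Conv1D(1, 3, padding="same"),
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=LEARNING_RATE),
    loss="mse",  # placeholder for the VAE's reconstruction + KL loss
    metrics=["mean_absolute_error"],
)

callbacks = [
    # Keep only the weights with the lowest validation MAE.
    ModelCheckpoint("vae_best.h5", monitor="val_mean_absolute_error",
                    mode="min", save_best_only=True),
    # Stock Keras stand-in for the repository's CustomStopper callback.
    EarlyStopping(monitor="val_loss", mode="auto"),
]

# Random arrays standing in for the UK-DALE mains/appliance windows.
x = np.random.rand(256, WINDOW_SIZE, 1).astype("float32")
y = np.random.rand(256, WINDOW_SIZE, 1).astype("float32")

model.fit(x, y, epochs=EPOCHS, batch_size=BATCH_SIZE,
          validation_split=VALIDATION_SPLIT, callbacks=callbacks)
```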

However, the performance metrics after training were very poor (I trained five appliances: Fridge, Kettle, Microwave, Dishwasher, and Washing Machine). Do you have any insights or suggestions for improvement?
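For context, a typical per-appliance score in this setting is the mean absolute error between predicted and ground-truth power, which matches the val_mean_absolute_error metric monitored during training. A minimal sketch, with placeholder arrays rather than real UK-DALE traces:

```python
import numpy as np

def mae(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Mean absolute error between ground-truth and predicted appliance power."""
    return float(np.mean(np.abs(y_true - y_pred)))

# Placeholder values (watts), not real UK-DALE traces.
y_true = np.array([0.0, 0.0, 1200.0, 1250.0, 0.0])
y_pred = np.array([10.0, 5.0, 900.0, 1000.0, 30.0])
print(mae(y_true, y_pred))  # -> 119.0
```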

Thank you.

[two screenshots attached]

The model architecture was copied directly from your code, with only the necessary modifications to the latent dimensions to adapt to input data with a window size of 512. Additionally, I used Adam with a learning rate of 3e-4 in model.compile(); I don't believe this should significantly affect the training process.
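In code form, the changes look roughly like this. This is a toy VAE for illustration only, not your actual architecture: LATENT_DIM and the layer sizes are hypothetical placeholders, and only the loss wiring and the compile() call reflect what I described above:

```python
import tensorflow as tf
from tensorflow.keras import layers

WINDOW_SIZE = 512
LATENT_DIM = 16  # hypothetical placeholder, not the repository's actual value


class Sampling(layers.Layer):
    """Reparameterization trick: z = mu + sigma * epsilon."""

    def call(self, inputs):
        z_mean, z_log_var = inputs
        eps = tf.random.normal(tf.shape(z_mean))
        return z_mean + tf.exp(0.5 * z_log_var) * eps


class VAELoss(layers.Layer):
    """Adds the reconstruction + KL divergence losses to the model."""

    def call(self, inputs):
        x, x_hat, z_mean, z_log_var = inputs
        recon = tf.reduce_mean(
            tf.reduce_sum(tf.square(x - x_hat), axis=[1, 2]))
        kl = -0.5 * tf.reduce_mean(tf.reduce_sum(
            1.0 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var), axis=1))
        self.add_loss(recon + kl)
        return x_hat


# Toy encoder/decoder; the real layers come from the repository code.
inputs = layers.Input(shape=(WINDOW_SIZE, 1))
h = layers.Flatten()(inputs)
h = layers.Dense(128, activation="relu")(h)
z_mean = layers.Dense(LATENT_DIM)(h)
z_log_var = layers.Dense(LATENT_DIM)(h)
z = Sampling()([z_mean, z_log_var])

d = layers.Dense(128, activation="relu")(z)
d = layers.Dense(WINDOW_SIZE)(d)
x_hat = layers.Reshape((WINDOW_SIZE, 1))(d)

outputs = VAELoss()([inputs, x_hat, z_mean, z_log_var])
vae = tf.keras.Model(inputs, outputs)

# The loss is attached via add_loss() above, so compile() only sets the
# optimizer and the MAE metric that the checkpoint callback monitors.
vae.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=3e-4),
            metrics=["mean_absolute_error"])
vae.summary()
```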

[two screenshots attached]

ZhuHouYi commented 3 months ago

I look forward to your reply. I believe the issue lies in my training strategy, which has kept me from reproducing the excellent results presented in your paper. I would greatly appreciate your guidance. 😀