MingjunZhong / transferNILM

Transfer Learning for Non-Intrusive Load Monitoring
111 stars 42 forks

reproduce the result of UKDALE based on seq2point. #2

Open Hessen525 opened 4 years ago

Hessen525 commented 4 years ago

Hi, Dr. @MingjunZhong. Thanks for sharing the code. This time I am trying to reproduce the UK-DALE results based on seq2point, but the results I get for UK-DALE are quite different from your reported results. I'm wondering if I missed something; can anyone advise? There are two versions of UK-DALE, one published in 2015 and another in 2017; the only difference is that UK-DALE (2017) has more data for building 1. As your paper says, 'ukdale: All the readings were recorded every 6 seconds from November 2012 to January 2015; the buildings 1 and 2 from UK-DALE were used for our experiments since the data in other buildings are small.' So I tried the UK-DALE 2015 version and set the parameters as follows:

```python
params_appliance = {
    'kettle': {
        'windowlength': 599,
        'on_power_threshold': 2000,
        'max_on_power': 3998,
        'mean': 700,
        'std': 1000,
        's2s_length': 128,
        'houses': [1, 3, 4, 5, 2],
        'channels': [10, 2, 3, 18, 8],
        'train_build': [1, 3, 4, 5],
        'test_build': 2,
    },
    'microwave': {
        'windowlength': 599,
        'on_power_threshold': 200,
        'max_on_power': 3969,
        'mean': 500,
        'std': 800,
        's2s_length': 128,
        'houses': [1, 5, 2],
        'channels': [13, 23, 15],
        'train_build': [1, 5],
        'test_build': 2,
    },
    'fridge': {
        'windowlength': 599,
        'on_power_threshold': 50,
        'max_on_power': 3323,
        'mean': 200,
        'std': 400,
        's2s_length': 512,
        'houses': [1, 4, 5, 2],
        'channels': [12, 5, 19, 14],
        'train_build': [1, 4, 5],
        'test_build': 2,
    },
    'dishwasher': {
        'windowlength': 599,
        'on_power_threshold': 10,
        'max_on_power': 3964,
        'mean': 700,
        'std': 1000,
        's2s_length': 1536,
        'houses': [1, 5, 2],
        'channels': [6, 22, 13],
        'train_build': [1, 5],
        'test_build': 2,
    },
    'washingmachine': {
        'windowlength': 599,
        'on_power_threshold': 20,
        'max_on_power': 3999,
        'mean': 400,
        'std': 700,
        's2s_length': 2000,
        'houses': [1, 5, 2],
        'channels': [5, 24, 12],
        'train_build': [1, 5],
        'test_build': 2,
    },
}
```
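(For illustration only: the `houses`/`channels` pairs above map onto UK-DALE's on-disk layout, where each house directory holds per-channel readings in `house_X/channel_Y.dat` files. The small `params` subset and the `channel_paths` helper below are made up for this sketch, not part of the repository.)

```python
# Hypothetical sketch: turn the (house, channel) pairs from params_appliance
# into the UK-DALE file paths they refer to.
params = {
    'kettle': {'houses': [1, 3, 4, 5, 2], 'channels': [10, 2, 3, 18, 8]},
}

def channel_paths(appliance, root='ukdale'):
    """Return one path per (house, channel) pair for the given appliance."""
    p = params[appliance]
    return [f"{root}/house_{h}/channel_{c}.dat"
            for h, c in zip(p['houses'], p['channels'])]

print(channel_paths('kettle')[0])  # ukdale/house_1/channel_10.dat
```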

The results are very different from yours (early stopping, 100 epochs, patience 10); e.g. for dishwasher, 41.2 vs 27.7. I'm wondering if I missed something; any help/advice would be much appreciated!
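(As a side note on the metric: the MAE figures quoted in this thread are in watts, computed after undoing the `(x - mean) / std` normalisation applied to the appliance readings. A minimal sketch; the function name and toy values are illustrative, not the repository's code.)

```python
import numpy as np

def mae_watts(pred_norm, true_norm, mean, std):
    """Mean absolute error in watts, after de-normalising predictions
    and ground truth with the same mean/std used for training."""
    pred_w = pred_norm * std + mean
    true_w = true_norm * std + mean
    return float(np.mean(np.abs(pred_w - true_w)))

# Toy check: predictions off by 0.1 normalised units with std=1000 -> 100 W.
print(mae_watts(np.full(5, 0.1), np.zeros(5), mean=700, std=1000))  # 100.0
```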

MingjunZhong commented 4 years ago

@Hessen525 If you look up the tables on README.md on this repository, you will find that house 1 only was used for training and house 2 as test data. Have you used the code provided in this repository to produce the training, validation, and test sets? Unfortunately, NILM algorithms would produce different results if the training data are different.
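(For context, the training pairs that seq2point preprocessing produces can be sketched as follows — a simplified illustration, not the repository's actual code: each input is a sliding window of mains readings, and the target is the appliance reading at the window's midpoint.)

```python
import numpy as np

def seq2point_windows(mains, appliance, window=599):
    """Build (input window, midpoint target) pairs in the seq2point style."""
    offset = window // 2
    # One row per sliding position over the mains signal.
    X = np.lib.stride_tricks.sliding_window_view(mains, window)
    # Target is the appliance reading aligned with each window's midpoint.
    y = appliance[offset: offset + len(X)]
    return X, y

mains = np.arange(1000.0)
appliance = np.arange(1000.0)
X, y = seq2point_windows(mains, appliance)
print(X.shape, y.shape)  # (402, 599) (402,)
```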

Hessen525 commented 4 years ago

@MingjunZhong Hi, Dr. Zhong. Thanks for your quick reply. I didn't mean any offense. Yes, I am using this transferNILM project.

In your transferNILM paper, part A (Sequence-to-point learning), it says 'For UK-DALE, the method was trained on the houses 1, 3, 4 and 5 and tested on the house 2'. The dataset used and the results are exactly the same as described in your seq2point learning paper -- different from the training set in README.md, but the same as the parameters given above, because some buildings don't have the specified appliances.

In the experimental setup part of the transferNILM paper, section 4) Preparing training and test data, it says 'The buildings 1 and 2 from UK-DALE were used for our experiments since the data in other buildings are small.' -- similar to your training set in README.md.

That is why I guess you used different training sets for seq2point learning and transfer learning: as mentioned above, UK-DALE is described twice but with different sets.

Do you mean that all UK-DALE experiments use the same training set as mentioned in README.md, whether for seq2point or for appliance transfer learning for sequence-to-point learning?

MingjunZhong commented 4 years ago

@Hessen525 Yes, I think it would be more consistent to follow the README.md in this repository -- using buildings 1 & 2 for UK-DALE.

Hessen525 commented 4 years ago

@MingjunZhong Hi, Dr. Zhong, I am trying this project based on UK-DALE 2015, but I still cannot reproduce results as good as yours. I set epochs=100 and patience=10 in early stopping for better results. I am wondering if I missed anything, or if I used a different version of UK-DALE? (There are two versions of UK-DALE behind your link:)

ukdale-2017: https://data.ukedc.rl.ac.uk/browse/edc/efficiency/residential/EnergyConsumption/Domestic/UK-DALE-2017/UK-DALE-FULL-disaggregated

ukdale-2015: https://data.ukedc.rl.ac.uk/browse/edc/efficiency/residential/EnergyConsumption/Domestic/UK-DALE-2015/UK-DALE-disaggregated

MingjunZhong commented 4 years ago

@Hessen525 I think the transfer learning paper was using the 2017 data, but for the UK-DALE results (training and testing both on UK-DALE) the numbers were taken from the 2018 sequence-to-point learning paper, and that one used the 2015 data.

Txiaoxiao commented 4 years ago

@Hessen525 Hey, I have gotten the same result as yours, bigger than 40 for dishwasher vs 27.7 in the paper. Have you ever managed to reproduce the results of the paper?

hellowangqian commented 4 years ago

Hi @Txiaoxiao, I've been able to get even lower MAE on UK-DALE for dishwasher. Mean (std) of MAE over 10 trials:

kettle: 10.9 (1.93)
microwave: 9.05 (2.73)
fridge: 23.21 (1.06)
dishwasher: 18.25 (5.17)
washingmachine: 9.4 (0.49)

Note that I set the input window length to 199 rather than 599 and modified the model architecture a bit by replacing the 2D conv with a 1D conv (similar to this implementation: https://github.com/OdysseasKr/neural-disaggregator/blob/master/ShortSeq2Point/shortseq2pointdisaggregator.py).
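(A rough Keras sketch of such a 1D-conv seq2point variant with window length 199. The filter counts and kernel sizes below follow the original seq2point paper's architecture; hellowangqian's exact model may differ, so treat this as an assumed illustration.)

```python
from tensorflow.keras import layers, models

def build_seq2point_1d(window_length=199):
    """Seq2point-style network using Conv1D layers instead of Conv2D."""
    model = models.Sequential([
        layers.Input(shape=(window_length, 1)),
        layers.Conv1D(30, 10, activation='relu', padding='same'),
        layers.Conv1D(30, 8, activation='relu', padding='same'),
        layers.Conv1D(40, 6, activation='relu', padding='same'),
        layers.Conv1D(50, 5, activation='relu', padding='same'),
        layers.Conv1D(50, 5, activation='relu', padding='same'),
        layers.Flatten(),
        layers.Dense(1024, activation='relu'),
        layers.Dense(1),  # single midpoint power value per input window
    ])
    model.compile(optimizer='adam', loss='mse')
    return model
```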

MingjunZhong commented 4 years ago

@hellowangqian @Txiaoxiao @Hessen525: It's interesting to see these results. These hyperparameters, i.e. the window length and the model architecture (number of layers and types of NN), affect the results significantly (for example, dishwasher from 27.7 to 18.25). So each appliance may have to use independent parameters, but then the problem is that it would be tricky to transfer from appliance to appliance.

Hessen525 commented 4 years ago

@MingjunZhong Hi, Dr. Zhong, I guess I found the answer to why we can't reproduce the results. In create_trainset_ukdale.py, you set training_building_percent = 95 (line 41) and then drop 95% of that house's data (line 122). This seems different from the training data in your paper.
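(A minimal sketch of what that setting implies; this is assumed behaviour for illustration, not the repository's exact code. A later comment in this thread reports that it is the last 5% of a house's readings that survives into the training set, so the sketch keeps the tail.)

```python
import numpy as np

training_building_percent = 95  # as set in create_trainset_ukdale.py

def keep_tail(readings, percent_dropped=training_building_percent):
    """Discard the first `percent_dropped`% of rows, keeping the tail."""
    cut = int(len(readings) * percent_dropped / 100)
    return readings[cut:]

readings = np.arange(1000)
print(len(keep_tail(readings)))  # 50
```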

Txiaoxiao commented 4 years ago

> Hi @Txiaoxiao I've been able to get even lower MAE on UK-DALE for dishwasher. Mean (std) of MAE over 10 trials: kettle: 10.9 (1.93), microwave: 9.05 (2.73), fridge: 23.21 (1.06), dishwasher: 18.25 (5.17), washingmachine: 9.4 (0.49). Note that I set window length of input as 199 rather than 599 and modified the model architecture a bit by replacing 2D conv with 1D conv (similar to this implementation: https://github.com/OdysseasKr/neural-disaggregator/blob/master/ShortSeq2Point/shortseq2pointdisaggregator.py).

Wow, the results are excellent; that helps me a lot with my research. Thanks!

ZhuHouYi commented 2 months ago

Hello, I would like to ask about using house 1 for training and house 2 for testing with the UK-DALE dataset. At line 121 of your dataset_management/ukdale/create_trainset_ukdale.py file, I found that you remove 95% of the data; in my experiment, only the last 5% of the house 1 kettle training data remained (about 1.6 million rows). I want to know whether fine-tuning could be applied to the discarded 95% of the data, so that the model can better learn the appliance usage in house 1. Because the model has not seen the data in house 2, the transfer learning result may not be good. Thank you.

MingjunZhong commented 2 months ago

Hi, I wouldn't fine-tune on the same house. It might be better to train the model using all the data from that house, if you have plenty of memory.