Closed — zhangdahua1 closed this issue 2 years ago
Thanks for your question. We use a time granularity of 1 hour. Each CSV file contains 24 data points, representing the load profile for a single day (24 hours). With 73 days of load profile data, we think this is more than enough for training and testing. After all, the load profile only describes p(s' | s, a) in the Bellman equation. Please let us know whether this answers your question.
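For concreteness, here is a minimal sketch (not the repository's own code) of how the per-day load profiles described above could be read and randomly split into training and testing halves. The directory layout and file names (`systems/13bus/loadshape/<day>/<bus>.csv`) are assumptions based on the example path mentioned in the question; adjust them to your checkout.

```python
# Sketch only: load 24-hour load profiles per day folder and split days 50/50.
import glob
import os
import numpy as np

def load_day(day_dir):
    """Read every bus CSV in one day folder; each file holds 24 hourly values."""
    profiles = {}
    for csv_path in glob.glob(os.path.join(day_dir, "*.csv")):
        bus = os.path.splitext(os.path.basename(csv_path))[0]
        profiles[bus] = np.loadtxt(csv_path, delimiter=",")  # shape: (24,)
    return profiles

# Collect the day folders (e.g. 000 ... 072) and randomly split them in half.
day_dirs = sorted(glob.glob(os.path.join("systems", "13bus", "loadshape", "*")))
rng = np.random.default_rng(0)
perm = rng.permutation(len(day_dirs))
half = len(day_dirs) // 2
train_days = [day_dirs[i] for i in perm[:half]]
test_days = [day_dirs[i] for i in perm[half:]]

train_data = [load_day(d) for d in train_days]
test_data = [load_day(d) for d in test_days]
```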
My question has been answered. Thank you very much for your reply.
Sorry, I have some problems again. I noticed that you wrote 'the load profiles are randomly partitioned into two halves, one for training and the other for testing.' For example, there are only 24 data points in powergym-master\systems\13bus\loadshape\000\611.csv. Even with 73 folders like this, that does not seem like enough data. So I'd like to ask: how is your training data composed? And what is the time scale of the load data? Thank you very much for your reply.