m3h / loadshedding

Automatic hibernate during loadshedding
GNU General Public License v3.0

Support Cape Town lower stage #6

Closed knoffelcut closed 2 years ago

knoffelcut commented 3 years ago

Commonly, the areas under the City of Cape Town operate one stage lower than the rest of the country, especially before 19:00. We either need an additional check for this situation or, if that is not possible, a way for the user to specify this behaviour with a few simple rules, e.g.: if (current_time after 07:00) and (current_time before 19:00) then true_stage = stage - 1
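As a sketch of what such a user-specified rule could look like in the script (the function name and the clamping to stage 0 are my own assumptions, not part of the existing code):

```python
from datetime import datetime, time

def adjust_stage(stage: int, now: datetime) -> int:
    """Apply the hypothetical Cape Town rule: one stage lower than
    the national stage between 07:00 and 19:00. Illustrative only."""
    if time(7, 0) <= now.time() < time(19, 0):
        # Clamp at 0 so stage 1 nationally doesn't become stage -1
        return max(stage - 1, 0)
    return stage
```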

m3h commented 3 years ago

I'm not sure how consistent CT is with being one stage lower. I would think a more permanent fix would be to use a different API for those areas.

I think, for e.g., scraping https://www.capetown.gov.za/Family%20and%20home/Residential-utility-services/Residential-electricity-services/Load-shedding-and-outages might be a better long-term solution.

m3h commented 3 years ago

Following on from that - I don't think adding scraping to this 120-line script to support CT is a good idea. I would much rather spin that off into a separate project - especially since I've been thinking for a while that we need some sort of proxy to access the Eskom API currently hard-coded in the script.

Maybe both of those tasks can be offloaded to an AWS free-tier Lambda, which can then be used as the API endpoint in the loadshedding script here.
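A minimal sketch of what that Lambda proxy might look like, assuming a Python handler; the upstream fetcher is injected so it can be stubbed, and all names here are illustrative rather than anything that exists in this repo:

```python
import json

def make_handler(fetch_status):
    """Build a Lambda-style handler that proxies an upstream stage
    fetcher (e.g. the Eskom API or a CT scraper). Injecting the
    fetcher keeps the handler testable without network access."""
    def handler(event, context):
        try:
            stage = fetch_status()
            return {"statusCode": 200, "body": json.dumps({"stage": stage})}
        except Exception:
            # Hide upstream flakiness behind a stable error shape
            return {"statusCode": 502,
                    "body": json.dumps({"error": "upstream unavailable"})}
    return handler
```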

knoffelcut commented 2 years ago

Regarding the scraping, it seems at least some historical data is available in the Wayback Machine: https://web.archive.org/web/*/https://www.capetown.gov.za/Family%20and%20home/Residential-utility-services/Residential-electricity-services/Load-shedding-and-outages

I think offloading it to an AWS free-tier Lambda function, as a separate project, is a great idea. It is probably best to first write a manual front-end where CPT loadshedding times and stages can be set, since I expect the scraping accuracy to be quite poor at first.

m3h commented 2 years ago

https://ewn.co.za/assets/loadshedding/api/status

This is interesting - an API for the CoT stage. It's the best I could find so far, but the results are in complicated English... this could be a problem

m3h commented 2 years ago

An example of the output:

```
$ curl https://ewn.co.za/assets/loadshedding/api/status
ESKOM LOAD-SHEDDING: 7-13 NOV
ESKOM CUSTOMERS ON STAGE 2 UNTIL 05:00 ON SATURDAY, 13 NOVEMBER. CITY-SUPPLIED CUSTOMERS ON STAGE 1 FROM 06:00 UNTIL 22:00 ON SUNDAY, THEN STAGE 2 UNTIL 05:00. CHECK THE SCHEDULE FOR TIMES AND AREAS AFFECTED, AND BE PREPARED FOR OUTAGES.
$
```
knoffelcut commented 2 years ago

Yes, both websites are in "complicated English", and I'm concerned about the consistency of their wording. As a start I'll just save the curl output of both sites on an hourly basis or so; after that we can look at parsing it.
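Once some output has been collected, a first parsing pass could be as simple as pulling out every "STAGE n" mention; the function below is a rough sketch against the example output above, not something robust to the wording changes we're worried about:

```python
import re

EXAMPLE = (
    "ESKOM CUSTOMERS ON STAGE 2 UNTIL 05:00 ON SATURDAY, 13 NOVEMBER. "
    "CITY-SUPPLIED CUSTOMERS ON STAGE 1 FROM 06:00 UNTIL 22:00 ON SUNDAY, "
    "THEN STAGE 2 UNTIL 05:00."
)

def find_stages(text: str) -> list[int]:
    # Extract every "STAGE <n>" mention in order of appearance;
    # the surrounding sentence decides whether a mention applies
    # to Eskom or to city-supplied customers, which this ignores.
    return [int(n) for n in re.findall(r"STAGE\s+(\d+)", text.upper())]
```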

m3h commented 2 years ago

Great minds think alike: http://les-m3h.duckdns.org:7217/

m3h commented 2 years ago

I've started something regarding a CoT scraping API: https://github.com/m3h/loadshedding-stage-api

Maybe try it out if you want, and let me know if you think it's worth it (the CoT scraping is extremely fragile, and I'm basically thinking of having error alerts and smooth change deployments to account for that... oof).

You can try it out here: https://8yt84sq5d4.execute-api.af-south-1.amazonaws.com/default/loadshedding

m3h commented 2 years ago

This should now be possible by reverse engineering the CoCT app

knoffelcut commented 2 years ago

This is addressed by loadshedding-thingamabob, which uses the URL of the CoCT app to create a schedule of forthcoming stage changes.

It is supported by this project through https://github.com/m3h/loadshedding/pull/17, but as stated in that PR, it uses an AWS backend and the URL to that backend is not public.

In my view, we can close this issue. I will gladly share the AWS backend URL with you (even though it won't have any value to you). The true issue is on loadshedding-thingamabob's side, where the URLs should be made public (which won't happen until I have implemented rate limiting and figured out the expected cost, if any).