EnergyInnovation / eps-us

Energy Policy Simulator - United States
GNU General Public License v3.0

Model Electricity Balancing Areas #219

Closed jrissman closed 1 year ago

jrissman commented 2 years ago

Currently, the EPS has a single "area" for all electricity production and demand within a given EPS model. Constraints on electricity flows are handled via a "transmission connectivity coefficient" and a system of "flexibility points," where some resources generate flexibility points (demand response, energy storage, etc.) and other resources require flexibility points to operate without curtailment (wind, solar). Cross-regional flows are only used for imports and exports of electricity from the modeled area and are taken in as input data rather than calculated endogenously.

We could improve the representation of the electricity sector, better incorporating a true grid expansion model into the EPS, by introducing a new subscript called Balancing Area: a large area with good transmission connectivity within its borders but limited capacity to import and export electricity. The model calculates what to build within each balancing area individually. Each balancing area has its own unique capacity factors (for instance, one area might be windier than another), but most cost and performance inputs would be the same for every area (i.e., a wind turbine costs the same to build in any balancing area). Each balancing area has its own market price for electricity (in each hour) based on the highest-cost plant type that is dispatched in that hour.

Each balancing area has a certain transmission capacity linking it to adjacent balancing areas. Trade between areas is calculated via several passes:

  1. If an area has lower-cost electricity than a connected area (in a given hour), it exports electricity to the connected area at the full capacity of its transmission connection.
  2. Dispatch in all areas is then recalculated given the demand levels altered by the electricity trade in the previous step.
  3. If the low-cost area is still lower-cost, we are done.
  4. If the low-cost area is now higher-cost, we sent too much electricity in step 1, so we reduce the amount transferred in step 1 by half and return to step 2.
  5. Repeat steps 1 through 4, adjusting electricity flows by half as much each time, until prices stop changing (because either transmission capacity between neighboring areas is maxed out or the regions' prices have become equal).

Rather than test for whether prices have become equal in step 5, we could always do a certain number of passes (say, 10) that we feel is enough passes to reliably get a final answer. This would simplify the code considerably.
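The halving adjustment described in steps 1 through 5 can be sketched in ordinary code. The following is a minimal illustration for a single hour and a single pair of connected areas; the `price` and `balance_trade` functions, the supply-curve representation, and all numbers are hypothetical stand-ins, not EPS structures.

```python
def price(demand, supply_curve):
    """Marginal price: cost of the highest-cost plant dispatched to meet demand.
    supply_curve is a list of (cost, capacity) tuples sorted by ascending cost."""
    served = 0.0
    for cost, capacity in supply_curve:
        served += capacity
        if served >= demand:
            return cost
    return supply_curve[-1][0]  # demand exceeds total capacity; price at the top


def balance_trade(demand_a, curve_a, demand_b, curve_b, tx_capacity, passes=10):
    """Find the A-to-B flow by adjusting it in halving steps (a fixed number
    of passes, as suggested above, rather than testing for price equality)."""
    flow = 0.0
    step = tx_capacity  # pass 1: export at the full transmission capacity
    for _ in range(passes):
        # Exporting raises A's generation requirement and lowers B's
        if price(demand_a + flow, curve_a) < price(demand_b - flow, curve_b):
            flow = min(flow + step, tx_capacity)  # A still cheaper: send more
        else:
            flow = max(flow - step, 0.0)  # overshot: back off
        step /= 2.0
    return flow
```

When the price gap is large, the flow pins at the transmission limit; when the areas start at equal prices, no trade occurs.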

Multiple passes can be made within a given timestep via a special subscript such as "Balancing Pass" and special code to use the results of earlier elements of this subscript to influence later elements of the subscript. It can be facilitated by mapping a subscript to itself with an off-by-one increment.

For regions that lack good data on multiple balancing areas, all of the generation and demand could be allocated to "Balancing Area 1" and zeroes left for all other balancing areas, so it effectively continues to function like a one-region model. We would need to retain some concept of flexibility points and/or transmission connectivity coefficient, which would be input separately for each balancing area, and which would refer to these constraints within the borders of that balancing area. For models with a single big balancing area, these constraints would be similar to how they operate today governing the whole electricity sector.

For the U.S. nationally, there are a number of ways to divide up the grid into balancing areas. There are technically 74 balancing areas in the three big "interconnection" regions that cover the continental U.S., of which 66 are in the U.S., 7 in Canada, and 1 in Mexico. If good data exist for the 66 balancing authorities, we could possibly use that. But a better alternative might be to use the 25 areas designated in the EIA's Electricity Market Module for their model, because (1) this would facilitate using output from the EIA's model as inputs to the EPS, (2) it might facilitate comparison between our results and EIA results, (3) it adds credibility to use the same balancing areas in our model as EIA uses in their model, and (4) 25 balancing areas is probably enough for good resolution. (The main reason to use 66 areas would be if there are not good data for the 25 areas and there are good data for the 66 areas.)

EIA Electricity Market Module Regions

Neither the 25-area nor the 66-area breakdowns include non-continental parts of the U.S. such as Alaska, Hawaii, and Puerto Rico. We might need to designate those as additional balancing areas with no electricity connections to any other balancing areas.

Note that this feature should be considered in conjunction with a move to dispatch electricity in each hour of the year (8760 hours, perhaps represented via day (1-365) and hour (1-24) subscripts) rather than an annual total quantity dispatched, because it isn't realistically possible to model these inter-area electricity trade flows and understand what is dispatched without hourly resolution. Some connections between balancing areas involve electricity flowing in one direction in one hour and in the opposite direction in another hour.

This would add to the total number of calculations Vensim must do per timestep, but I think it could be kept to a manageable runtime. If there are 8760 hours, 25 balancing areas, and 10 price-finding passes, that is an additional 2.19 million computations per timestep (where a single computation might be an ALLOCATE AVAILABLE pass, plus a little other math in surrounding variables). That is probably still doable while maintaining fast, user-friendly responsiveness, particularly in the web app, which will see an overall computation speed increase of over 10x once the current web app modernization effort is done. But it would be worth testing to ensure it really can run fast enough. Most true grid capacity expansion and dispatch models don't run that fast, but most such models use old code and are probably poorly optimized compared to what we can achieve with the cutting-edge technology in the new web app, so we'll have to test and see how fast we can get it to run.
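As a quick arithmetic check on the 2.19 million figure above:

```python
# Back-of-envelope check of the per-timestep computation count
hours = 365 * 24   # hourly dispatch resolution (8760 hours)
areas = 25         # EIA Electricity Market Module balancing areas
passes = 10        # price-finding passes per hour

total = hours * areas * passes  # additional computations per timestep
```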

jrissman commented 2 years ago

Note that we should probably move to hourly dispatch first, and once that is working, then introduce balancing areas.

robbieorvis commented 2 years ago

One other interesting thing here… I am no expert in linear programming, but I gather it's how capacity expansion is typically done, and Vensim has an add-on developed by Tom Fiddaman for linear programming and solving: https://vensim.com/wp-content/uploads/2013/03/VenextLP-README.pdf

Might be worth a look, even if we don’t use it.

jrissman commented 2 years ago

Oh, that looks really promising. Definitely worth checking out. The many-to-many solver seems on point for trade flows from 25 balancing areas to 25 balancing areas. It might be faster than doing it ourselves using ALLOCATE AVAILABLE and 10 or so passes. It would be a little bit of a learning curve to define the problem as a linear program, but maybe worth it.

The fact that it is an implementation of an existing and respected open-source library might make it easier to implement in SDEverywhere also, as SDE could simply borrow the function definitions from the open-source library rather than reverse engineering what Vensim's functions do.

jrissman commented 2 years ago

Another resource is this document describing how EIA does this: https://www.eia.gov/outlooks/aeo/nems/documentation/electricity/pdf/m068(2020).pdf

jrissman commented 2 years ago

For speed testing purposes, once we re-introduce ALLOCATE AVAILABLE (or use a linear programming solver, or both), I could artificially tell Vensim to run the allocation calculation X times per timestep. It would be the exact same allocation calculation every time, coming out with the same answer, so the only purpose would be to see how many times we can run the allocation operation per timestep while still keeping model runtime low. That might give us a sense of the threshold we have to keep under.

We should remember that the EPS will keep growing and getting more complex in the years ahead, so we don’t want to calibrate this one feature to use up all of the computing time we feel we can spare, because that would leave nothing for future features we might build.

One way to cut down on the total number of allocations would be to model a few representative days rather than all 8760 hours.

Note that doing just a few representative days would not be looking at the full amount of electricity dispatched in the year, so we wouldn’t be able to take the sum across all hours to get the total. We need annual totals to feed into the rest of the model. So we’d have to assume that every weekday in spring is an exact copy of the representative spring weekday, every weekend day in Spring is a copy of the representative Spring weekend day, etc.

Maybe that is what is meant by a "representative day." But this requires that the "representative day" be the average day of its kind, not the worst day of its kind, so it might limit our findings about grid resilience on the most difficult days of the year.

Alternatively, we could try modeling both “representative day” and “worst day” of each season, or something like that, and only use the “representative days” when computing annual totals.
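The scaling assumption described above (every calendar day is an exact copy of its representative day) can be sketched as follows. The day types, dispatch values, and day counts are illustrative placeholders, not EPS data.

```python
# Dispatch (MWh) computed for one representative day of each day type
rep_day_dispatch = {
    "spring_weekday": 2400.0,
    "spring_weekend": 1800.0,
    # ... remaining seasons and day types would follow the same pattern
}

# Number of calendar days each representative day stands in for
day_counts = {
    "spring_weekday": 65,
    "spring_weekend": 26,
}

# Annual total: each representative day's dispatch times the number of
# calendar days it represents, summed across all day types
annual_total = sum(rep_day_dispatch[d] * day_counts[d] for d in day_counts)
```

A "worst day" of each season would be excluded from `day_counts` so it informs resilience findings without distorting the annual totals.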

robbieorvis commented 2 years ago

ReEDS (NREL's capacity expansion model) has good documentation on how they handle this here (starting on p.13): https://www.nrel.gov/docs/fy21osti/78195.pdf

In short: they have 17 time slices, comprising 4 seasons with 4 parts of the day each, plus one superpeak representing the highest-demand hours of the year. They acknowledge the limitations of this approach for dispatch and have additional customizations to try to address those limitations (discussed throughout).

robbieorvis commented 2 years ago

GenX is another open source capacity expansion model that is very well documented, if you are curious to see other approaches here: https://genxproject.github.io/GenX/dev/

robbieorvis commented 2 years ago

Note this issue is tied closely to issue #106

jrissman commented 1 year ago

We've decided against doing this. It would be prohibitively slow to calculate dispatch for every hour, for every region, potentially many times to home in on inter-region flows. Also, we believe it is not necessary for a good representation of the power sector.

robbieorvis commented 1 year ago

I had also been thinking about using this for the capacity expansion piece, since that is what typically uses a LP solver.

jrissman commented 1 year ago

We've just developed a very nice approach to deciding which plants to build in #232, which includes checks for what to build based on meeting reliability requirements, based on an RPS, and based on economic favorability. All three involve logit functions, which have some key advantages: they are straightforward to tune, and they can be tuned independently for each of the three mechanisms to help calibrate the model and achieve realistic behavior. Also, logit functions are fast-running because they accomplish the allocation in a single pass.
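As a general illustration (not EPS code) of why a logit allocation runs in a single pass: each option's share is a closed-form function of its cost, with one sensitivity parameter to tune how strongly the allocation favors the cheapest option. All names and numbers below are hypothetical.

```python
import math

def logit_shares(costs, sensitivity=4.0):
    """Return each option's share of new builds; lower cost -> larger share.
    `sensitivity` controls how sharply shares concentrate on the cheapest option."""
    weights = {k: math.exp(-sensitivity * c) for k, c in costs.items()}
    total = sum(weights.values())
    return {k: w / total for k, w in weights.items()}

# Illustrative levelized costs (arbitrary units)
shares = logit_shares({"wind": 1.0, "solar": 1.1, "gas": 1.3})
```

Unlike the iterative price-finding passes discussed earlier in this thread, no loop over passes is needed: the shares fall out of one evaluation.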

I don't think this is the time to be considering a whole different way of doing capacity expansion. And I don't think we should consider anything that would require an external library to hook into Vensim because it would make the model too difficult to distribute and use for our partners and the public, and it would make the model platform-sensitive (i.e., different solutions for Mac, Windows) whereas now we can just target Vensim as our platform. So I think we should leave this LP idea closed for now and focus on trying to get the new capacity expansion mechanisms being built in issue #232 working well. It isn't done yet and needs a little more structural work and some more data work, and then it will hopefully be producing results we like. We can then try to optimize run speed.