Closed NLaws closed 3 years ago
We might want to consider a structural constraint, say, that the average state-of-charge must be 50% of capacity or greater. Incentivizing storage levels via the objective function might require parameter tuning, i.e., a per-kWh incentive might produce the desired behavior in some cases but not others, and a poorly chosen incentive value might yield unnecessary (or rather, sub-optimal) battery purchases in some cases. If this seems reasonable, I can implement it.
@zolanaj We have implemented this in the past with (in pseudocode):
```
dvMeanSOC = sum(dvStoredEnergy[ts] for ts in TimeStep) / TimeStepCount
...
objective = min(LCC - dvMeanSOC)
```
which has worked well and has not impacted the storage system purchase decision variables.
Dividing by `TimeStepCount` keeps the value small compared to the LCC. If you divide by `dvStorageKWH` instead, the problem becomes nonlinear, but maybe you have something else in mind.
Yes please implement if you have time!
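For concreteness, the incentive above can be written as a small self-contained JuMP sketch. This is not REopt's actual formulation: the toy horizon, variable bounds, the placeholder `LCC` expression, and the HiGHS solver choice are all illustrative assumptions.

```julia
using JuMP, HiGHS  # HiGHS is an illustrative open-source solver choice

TimeStep = 1:24                  # toy horizon; REopt would use a full year
TimeStepCount = length(TimeStep)

m = Model(HiGHS.Optimizer)
@variable(m, 0 <= dvStoredEnergy[TimeStep] <= 10)   # kWh held in storage each step
@expression(m, LCC, 100.0 + 0 * dvStoredEnergy[1])  # stand-in life cycle cost expression

# Mean stored energy; dividing by TimeStepCount keeps the term small relative to LCC
@expression(m, dvMeanSOC, sum(dvStoredEnergy[ts] for ts in TimeStep) / TimeStepCount)

@objective(m, Min, LCC - dvMeanSOC)
optimize!(m)
```

Because the incentive term stays linear, the model class (LP/MILP) is unchanged.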
The constraint would look something like the following (in pseudocode that probably breaks both Mosel and Julia):
```
@constraint( sum(dvStoredEnergy[ts] for ts in TimeStep) / TimeStepCount == 0.5 * dvStorageSizeKWH )
```
So, it would be a linear constraint (and we'd just substitute out `dvMeanSOC` from the above). My understanding is that when the battery is used for load-balancing purposes it can be beneficial to keep it at 50% so there's flexibility for both charging and discharging the battery. Dispatch might be markedly different for load balancing, peak shaving, and utilizing renewables when resources exceed load, but the battery system should be able to handle this constraint in every case.
I think we'll want to implement analogous constraints for the thermal storage system. Would you agree?
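As a sketch, the constraint variant might look like the following in JuMP. Again, this is illustrative only: `dvStorageSizeKWH` is fixed to a constant here purely to keep the snippet self-contained (in the real model it is a sizing decision variable, and the constraint remains linear either way).

```julia
using JuMP, HiGHS

TimeStep = 1:24
TimeStepCount = length(TimeStep)
dvStorageSizeKWH = 10.0   # fixed for illustration; a decision variable in practice

m = Model(HiGHS.Optimizer)
@variable(m, 0 <= dvStoredEnergy[TimeStep] <= dvStorageSizeKWH)

# Force the average stored energy to be (at least) half of capacity;
# an inequality (>=) may be gentler on the solver than strict equality
@constraint(m, sum(dvStoredEnergy[ts] for ts in TimeStep) / TimeStepCount
               >= 0.5 * dvStorageSizeKWH)
```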
I worry that adding this equality constraint will make the problem more difficult to solve than the old method, but we can try it and test the impact on scenarios with storage in the solution (I think that `test_complex_incentives` has PV and storage in the solution).
That's a fair concern; I'll try both and see how solution times are impacted. I'll report the run times in the PR that follows.
@zolanaj did you ever find time to look into this? If not, I think that we should implement the original method (it is in place here, but it needs a `p.TimeStepScaling` factor).
Let me take a look in the next day or so, and report back. Sorry for the delay!
@NLaws After some investigation, enforcing a storage minimum doesn't impact solution times or optimal objective values much when electric storage is the only technology, but it can noticeably increase costs when thermal storage is present due to the decay factor. So, I don't think the constraint I proposed earlier is a good fit. I'm thinking we might be best served by making this a user-specified option that toggles between what's currently `Obj=1` and `Obj=2`, and incentivize electric storage, i.e., use `Obj=2`, when the option is turned on (perhaps that's the default as well). If that works, I'll implement it. I think keeping the incentive for electric storage only is best, again because there's no decay factor (and it doesn't change much from what's already implemented). Would you agree?
@zolanaj @NLaws We plan to soon change the thermal decay/loss term for TES to be based on the storage capacity, not the SOC, so that shouldn't be an issue. Thanks for considering that current issue though!
@Bill-Becker @NLaws I believe the issue of (unnecessary) added costs will remain in that case too. A decay factor on either capacity or SOC will add a cost to maintaining a SOC above zero, which we would force by a constraint in the proposed approach several posts back (i.e., a constraint forcing average SOC to be above some threshold). Further, if it's best to only use the TES to capture excess CHP heat when heat loads are already met (and that seems to be the motivation, since we don't really have peak charges on heat like we do for electricity), then it seems like a bad policy to enforce a minimum SOC; you'd want capacity to capture more heat, almost the opposite of a battery which you may want to keep at a high SOC for outages or peak demand charges.
Similarly, the decay factor and different use case are why I'm suggesting the small incentive for electrical storage only, as currently implemented when `Obj==2`, is best.
@zolanaj @NLaws If we model thermal decay based on energy storage capacity (i.e. the max kWht that can be stored in the TES), then whether the SOC is 100% or 0%, the TES will lose heat at the same rate and will have to make up that heat even if maintaining the SOC at 0%. In reality, there is some increased thermal loss for high SOC relative to low SOC (but it's not nearly 1:1 with SOC, so we decided to remove that), and maybe part of the point is that we don't want to arbitrarily incentivize TES to stay at a high SOC because that promotes a worse approximation of the capacity-based loss. I agree we don't want to do that.
@zolanaj is there a reason to ensure a mean SOC of 50%? I would venture that this is not the type of thing we'd want an equality constraint on. From my work on degradation, you actually want the battery to rest at a low SOC generally... so our obj 5 (or obj 2) is actually not beneficial for battery health. And I can't imagine that there's any particular average SOC that we'd want to constrain it to over the course of the year. I think in general we've used that constraint to eliminate randomness in the battery dispatch (e.g. ensure it charges right at the beginning of a low-TOU period in the day), and to trend towards a resiliency case where the battery is often charged. That said, we should consider why we are implementing any of these SOC-type encouragements.
@zolanaj yes agreed that the default/hard-coded value should be `obj=2` (it was that way intentionally before the reformulation). We just need to make sure the "cost" of the SOC is not included in any outputs, i.e. the `JuMP.objective_value` is not the actual life cycle cost, but the `RECosts` expression is.
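The distinction between the two values can be shown with a toy model (a sketch: the constant stand-in for `RECosts` and the single `soc` variable are assumptions, not the real expressions):

```julia
using JuMP, HiGHS

m = Model(HiGHS.Optimizer)
@variable(m, 0 <= soc <= 1)
@expression(m, RECosts, 10.0 + 0 * soc)  # stand-in life cycle cost expression
@objective(m, Min, RECosts - soc)        # SOC incentive baked into the objective
optimize!(m)

value(RECosts)       # 10.0: the actual cost, safe to report as an output
objective_value(m)   # 9.0: includes the artificial incentive, not a real cost
```

The solver drives `soc` to 1 to collect the incentive, so the objective understates the cost by the incentive amount; only the evaluated cost expression should be reported.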
@dylancutler (I didn't get your comment before sending my last one.) This `obj=2` was never about battery health. It was implemented a while ago for "more realistic battery dispatches", i.e. avoiding the cases where (since the model has perfect foresight) the battery charges just before every peak shave event.
@nlaws @dylancutler thanks for your feedback, and copy on the removal of the incentive from the objective function. It seems to me that there is enough motivation to keep this as an option, so I'm going to make it so, with a default of adding the small incentive for state of charge (i.e., "on"). I'll see if I can add some prose on what outcome you might expect from each (battery usually resting at "empty" vs. "full"). I'll make the default "on", adjust the test posts so that it's set to "off", and add a test that sets the incentive "on" and "off" and reports the difference in SOC, which should be small. PR to follow, likely this evening.
@NLaws @zolanaj, all sounds good. Nick, I totally get that this was not about battery health, but I think we should consider whether this is really a more "realistic dispatch". For example, the community of homes we are analyzing in AZ with solar+storage in every home recently changed their dispatch strategy to charge exactly before the peak pricing period, hitting 100% SOC exactly at 3pm. So, while I agree something to avoid 'randomness' in our dispatch would be good, I think we want to be careful in explaining exactly why we're choosing this high-SOC encouragement as the default.
Probably too much input from me :) sorry! And I think your/Alex's proposed approach is fine; I just want to make sure we're clear on our assumptions, because people may want to look into these exact dispatches and understand these issues.
@dylancutler Thanks for this, and the illustrative example. It does make sense that a battery used for peak shaving would warrant a generally empty battery with a diurnal charge-discharge pattern (at least during high-demand times of year). A reasonable example in which a battery would tend to be kept fully charged is when it's used for resilience. I don't have a strong opinion on which is the default, but that's a very quick change based on the planned implementation.
Currently the `obj` value is hardcoded to 1, which minimizes LCC. However, the production API uses `obj=2`, which also maximizes the average storage state-of-charge so that storage dispatches "look" more reasonable (otherwise the model will typically choose to charge the battery just-in-time for peak shaving, etc.). Changing `obj` from 1 to 2 will require updating expected values for `test_resilience_stats.py`.