NREL / floris

A controls-oriented engineering wake model.
http://nrel.github.io/floris
BSD 3-Clause "New" or "Revised" License

Returning gradients for yaw optimization #219

Open petebachant opened 3 years ago

petebachant commented 3 years ago

I see that this feature is on the roadmap. Has there been any discussion on how to implement it, or is there maybe a WIP branch?

bayc commented 3 years ago

The discussion so far has centered around using JAX (specifically the autograd functionality within JAX), and we are currently performing a refactoring of FLORIS in which we are working to enable all the models to be JAX friendly. As for a timeline, we are hoping to have the refactoring done sometime in August, after which the gradients would (hopefully) be available shortly.
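For anyone unfamiliar with the approach, here is a minimal sketch of what JAX's automatic differentiation buys you. The `farm_power` function below is a made-up placeholder, not the FLORIS wake model; the cosine exponent and the deflection term are invented purely for illustration:

```python
import jax
import jax.numpy as jnp

# Toy stand-in for farm power as a function of yaw angles (degrees).
# NOT the FLORIS wake model; all coefficients here are made up.
def farm_power(yaw):
    rad = jnp.deg2rad(yaw)
    power = jnp.cos(rad) ** 1.88                             # yaw-induced power loss
    power = power.at[1:].add(0.3 * jnp.sin(rad[:-1]) ** 2)   # toy downstream deflection gain
    return jnp.sum(power)

# Once the model is traceable by JAX, the gradient comes essentially for free:
dpower_dyaw = jax.grad(farm_power)

yaw = jnp.array([20.0, 0.0, 0.0])
print(farm_power(yaw))   # scalar farm power
print(dpower_dyaw(yaw))  # d(power)/d(yaw_i) for every turbine at once
```

The catch is that every operation in the model has to be traceable: `jnp` instead of `np`, no in-place mutation, no Python-side branching on array values, which is presumably what the refactoring mentioned above is addressing.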

Do you have any experience with returning gradients from more complex code such as FLORIS?

petebachant commented 3 years ago

> Do you have any experience with returning gradients from more complex code such as FLORIS?

I don't. I'm just now starting to explore available tools like algopy. I also see that @andrewning's group at BYU modified some version of FLORIS to return gradients in a 2017 paper. Is that code floating around anywhere?

bayc commented 3 years ago

I don't have access to it directly, but can reach out and let you know what I find.

jaredthomas68 commented 3 years ago

@petebachant I'm in the FLOWLab (@andrewning's research group). The code you are referring to still exists, but it is not currently maintained and is in some disrepair. If you are still interested in it, you can access it at https://github.com/byuflowlab/PlantEnergy and https://github.com/byuflowlab/gaussian-wake.

We are currently developing a Julia version of FLORIS called FLOWFarm, from which gradients can be easily obtained. You can take a look at https://github.com/byuflowlab/FLOWFarm.jl. We have not used this code for yaw optimization very much, but it does work.

bayc commented 2 years ago

@petebachant Following up to see if you made any progress using algopy or any other solution on this front. I have tried implementing JAX, but haven't had the results I had hoped for yet.

petebachant commented 2 years ago

@bayc nope, unfortunately I haven't gotten around to it. It sounds like the Serial Refine optimization in 3.0 may negate the need for computing gradients for optimization. Do you guys have any benchmarks on the difference in AEP gain between Serial Refine and gradient-based optimization?

Bartdoekemeijer commented 2 years ago

@petebachant we have already used Serial Refine (SR) in multiple commercial settings back with floris v2.4 (#262). Here are some benchmark results for v2.4, where we used SR to get a warm start and then used SciPy to refine the optimal yaw solutions (vs. using SciPy directly for the entire optimization):

- Various commercial sites of ~50 turbines:
  - Computation time: between 20% and 50% of the SciPy optimization
  - AEP uplift from wake steering: 10% higher than SciPy (e.g., from +1.0% to +1.1%)
- Various commercial sites of ~100 turbines:
  - Computation time: between 20% and 60% of SciPy
  - AEP uplift from wake steering: 3% to 7% more than SciPy

Note that for large farms like these, even using SR on its own, without then refining the solutions with SciPy, often outperforms SciPy itself, since with this many dimensions the SciPy solver gets stuck in local minima. Also, with v3.0, all of this should really be multiple orders of magnitude faster than with v2.4.
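For illustration, here is a rough sketch of the warm-start-then-refine pattern on a toy objective. The serial pass sweeps one turbine at a time over a coarse yaw grid while holding the others fixed, and SciPy then polishes the result. The objective, grid spacing, and bounds below are all made-up placeholders, not FLORIS:

```python
import numpy as np
from scipy.optimize import minimize

# Made-up stand-in for negative farm power vs. yaw angles (degrees);
# FLORIS would supply the real objective. Yawing a turbine costs power
# (cosine law) but grants a toy deflection benefit to the next turbine.
def neg_farm_power(yaw):
    rad = np.deg2rad(yaw)
    power = np.cos(rad) ** 1.88
    power[1:] += 0.3 * np.sin(rad[:-1]) ** 2
    return -np.sum(power)

n_turbines = 5
yaw = np.zeros(n_turbines)
grid = np.arange(-25.0, 25.1, 5.0)  # coarse yaw grid for the serial pass

# Serial-Refine-style warm start: optimize one turbine at a time on the grid.
for i in range(n_turbines):
    trials = []
    for c in grid:
        test = yaw.copy()
        test[i] = c
        trials.append(neg_farm_power(test))
    yaw[i] = grid[int(np.argmin(trials))]

# Gradient-based (finite-difference here) refinement of the warm start.
result = minimize(neg_farm_power, yaw, method="SLSQP",
                  bounds=[(-25.0, 25.0)] * n_turbines)
print("warm start:", yaw)
print("refined:   ", result.x)
```

The serial pass scales much more gracefully than a joint optimization because each sweep is a one-dimensional search, which is also why it avoids the local minima that trap SciPy in high-dimensional yaw spaces.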

Now back to the topic of analytical gradients: if one insists on refining the solutions with SciPy, then I still see value in analytical gradients. Namely, the SciPy solver in v3.0 will show performance comparable to the SciPy solver in v2.4, and thus can still be a bottleneck for large farms.
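To make that concrete, one way an analytical gradient could plug in, assuming a JAX-traceable objective (again the toy placeholder from above, not FLORIS), is to hand `jax.grad` to SciPy's `jac` argument so the solver no longer spends model evaluations on finite differences:

```python
import jax
import jax.numpy as jnp
import numpy as np
from scipy.optimize import minimize

# Same toy objective as above, written with jnp so JAX can trace it.
def neg_farm_power(yaw):
    rad = jnp.deg2rad(yaw)
    power = jnp.cos(rad) ** 1.88
    power = power.at[1:].add(0.3 * jnp.sin(rad[:-1]) ** 2)
    return -jnp.sum(power)

# Exact gradient via autodiff, passed to SciPy instead of finite differences.
grad_fn = jax.grad(neg_farm_power)

n_turbines = 5
result = minimize(
    lambda y: float(neg_farm_power(jnp.asarray(y))),
    x0=np.zeros(n_turbines),
    jac=lambda y: np.asarray(grad_fn(jnp.asarray(y)), dtype=float),
    method="L-BFGS-B",
    bounds=[(-25.0, 25.0)] * n_turbines,
)
print(result.x)
```

For a farm with hundreds of turbines the saving is substantial: finite differencing needs on the order of n extra model evaluations per iteration, while reverse-mode autodiff returns the whole gradient for roughly the cost of one extra evaluation.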