peterdsharpe / AeroSandbox

Aircraft design optimization made fast through modern automatic differentiation. Composable analysis tools for aerodynamics, propulsion, structures, trajectory design, and much more.
https://peterdsharpe.github.io/AeroSandbox/
MIT License
690 stars 111 forks

Unsteady aero library #47

Closed · peterdsharpe closed this 3 years ago

peterdsharpe commented 3 years ago

Hey @antoniosgeme! This unsteady aero library looks really solid, awesome work!

I'm creating this PR just as a bookmark to remind us to merge this at some point down the line when you feel it's ready. No rush at all, and thanks for the great contributions!

-Peter

antoniosgeme commented 3 years ago

@peterdsharpe Thanks! I think after implementing some more unsteady analytical models, I will try to solve an optimal control (or trajectory optimization?) problem through some known disturbance. Let me know if you have any ideas/feedback! -Antonios

peterdsharpe commented 3 years ago

Sounds perfect! There are a few optimal control examples in AeroSandbox here if they're helpful; I'll spend a bit of time tonight elaborating on these and adding some aircraft examples!

(My advisor's been poking me to do this for my thesis work anyway, so it's really no trouble haha.)

antoniosgeme commented 3 years ago

Hey @peterdsharpe, I was wondering: is there a natural way to automatically differentiate functions within AeroSandbox, or should we resort to external packages? E.g.:

def foo(x):
    y = np.sin(x)
    return 2 * y**2 + 5

derivative(foo, 5)

peterdsharpe commented 3 years ago

Hi @antoniosgeme!

Depending on the case, you can use the gradient function from CasADi, the AD package underneath AeroSandbox! Basic example below; let me know if this makes sense:

import aerosandbox as asb
import aerosandbox.numpy as np
import casadi as cas

opti = asb.Opti()

x = opti.variable(init_guess=1)

f = x ** 2  # some scalar expression of x

dfdx = cas.gradient(f, x)  # symbolic derivative, df/dx = 2 * x

opti.subject_to(
    dfdx == -1  # constrain the derivative; the solution should be x = -0.5
)

opti.minimize(0)  # No objective; this is purely a feasibility problem

sol = opti.solve()

assert abs(sol.value(x) - (-0.5)) < 1e-6  # allow for solver tolerance

peterdsharpe commented 3 years ago

I'm not positive that CasADi has an operator space that is a closed set under differentiation, so there might be issues taking derivatives of arbitrarily high order with complicated functions, but this should get you through most things!

For derivatives of vector-valued functions, there's also cas.jacobian(), but the syntax here can be a bit finicky.
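
For higher-order derivatives, my understanding is that you can usually just nest cas.gradient (or use cas.hessian). A minimal sketch, using a plain CasADi symbol via cas.MX.sym rather than an Opti variable just to keep it self-contained:

import casadi as cas

x = cas.MX.sym("x")  # plain CasADi symbol, independent of any Opti instance
f = cas.sin(x) ** 2

dfdx = cas.gradient(f, x)       # 2 * sin(x) * cos(x)
d2fdx2 = cas.gradient(dfdx, x)  # second derivative, by nesting gradient

d2f = cas.Function("d2f", [x], [d2fdx2])
print(d2f(0.3))  # should be close to 2 * cos(2 * 0.3) ≈ 1.65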

peterdsharpe commented 3 years ago

Hmmm - an interesting issue! So, it looks like you can only naively differentiate using cas.gradient if you disable AeroSandbox's auto-scaling heuristics on variables by flagging opti.variable(..., scale=1) when you create the variable. I'll have to investigate this further!

antoniosgeme commented 3 years ago

Thanks Peter! I think this does the job.

import aerosandbox as asb
import aerosandbox.numpy as np
import casadi as cas

opti = asb.Opti()

x = opti.variable(init_guess=1, scale=1)

def foo(x):
    y = np.sin(x)
    f = y ** 2
    return f

f = foo(x)

dfdx = cas.Function("dfdx", [x], [cas.gradient(f, x)])

assert abs(float(dfdx(5)) - 2 * np.sin(5) * np.cos(5)) < 1e-12  # d/dx[sin(x)^2] = 2 sin(x) cos(x)

and more generally, for vector-valued functions:

opti = asb.Opti()

n_vars = 10

x = opti.variable(init_guess=np.ones(n_vars), scale=1)

def foo(x):
    y = np.sin(x)
    f = y ** 2
    return f

f = foo(x)

dfdx = cas.Function("dfdx", [x], [cas.jacobian(f, x)])  # returns the full n_vars x n_vars Jacobian

I haven't noticed the scaling issue, but I'll take your word for it.