Open dtonevsurf opened 6 years ago
Here's a small example using the hessian function.
import autograd.numpy as np
from autograd import grad
from autograd import hessian
# an example function
g = lambda w: 5*w[0]**2 + 7*w[1]**2
# define a test input
w = np.array([1.0,1.0])
# compute gradient function, then evaluate
nabla_g = grad(g)
print (nabla_g(w))
# compute hessian (second derivative matrix), then evaluate
hess_g = hessian(g)
print (hess_g(w))
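As a sanity check on what the snippet above should print, the same quantities can be approximated with plain NumPy central differences (fd_grad and fd_hessian are hypothetical helpers, not part of autograd): at w = [1, 1] the gradient is [10, 14] and the Hessian is diag(10, 14).

```python
import numpy as np

# the same example function
def g(w):
    return 5 * w[0]**2 + 7 * w[1]**2

def fd_grad(f, w, h=1e-4):
    # central-difference gradient of a scalar function
    return np.array([(f(w + h * e) - f(w - h * e)) / (2 * h)
                     for e in np.eye(len(w))])

def fd_hessian(f, w, h=1e-4):
    # row i = finite-difference gradient of the i-th gradient component
    return np.array([fd_grad(lambda v: fd_grad(f, v, h)[i], w, h)
                     for i in range(len(w))])

w = np.array([1.0, 1.0])
print(fd_grad(g, w))     # ~ [10. 14.]
print(fd_hessian(g, w))  # ~ [[10. 0.] [0. 14.]]
```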
The thing is that hessian computes the full Hessian; would it be possible to compute only the diagonal of the Hessian? I know we can extract it from the output of hessian, but this entails unnecessary computation.
EDIT: THIS PROPOSED SOLUTION DOES NOT WORK IN GENERAL - see dougalm's answer in #445 for explanation
With only six variables it might not be too big of a deal. Nonetheless, here's an updated version of the example above that includes a computation of only the diagonal of the Hessian (the pure second derivatives). In short: use elementwise_grad twice.
import autograd.numpy as np
from autograd import hessian
from autograd import elementwise_grad
# an example function
g = lambda w: 5*w[0]**2 + 7*w[1]**2
# define a test point
w = np.array([2.0,2.0])
# compute gradient function, then evaluate
nabla_g = elementwise_grad(g)
print ('gradient = ' + str(nabla_g(w)))
# compute hessian (second derivative matrix), then evaluate
hess_g = hessian(g)
print ('hessian at test point = ' + str(hess_g(w)))
# NEW: just the pure second derivatives - the diagonal of the hessian
diag_hess_g = elementwise_grad(nabla_g)
print ('diagonal of hessian at test point = ' + str(diag_hess_g(w)))
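To make the caveat in the EDIT concrete: elementwise_grad applied twice returns the gradient of the summed gradient, i.e. the row sums of the Hessian, which coincide with the diagonal only when the function is separable (as it happens to be above). Here is a plain-NumPy finite-difference sketch on a non-separable function, where fd_grad is a hypothetical helper standing in for autograd's grad:

```python
import numpy as np

# Non-separable counterexample: for g below the true Hessian is
# [[2, 1], [1, 0]], so its diagonal is [2, 0] but its row sums are [3, 1].
def g(w):
    return w[0] * w[1] + w[0]**2

def fd_grad(f, w, h=1e-4):
    # central-difference gradient (stand-in for autograd's grad)
    return np.array([(f(w + h * e) - f(w - h * e)) / (2 * h)
                     for e in np.eye(len(w))])

w = np.array([2.0, 2.0])

# What elementwise_grad(elementwise_grad(g)) effectively evaluates:
# the gradient of sum(grad g), i.e. the ROW SUMS of the Hessian.
row_sums = fd_grad(lambda v: fd_grad(g, v).sum(), w)

# The true Hessian diagonal: one pure second derivative per coordinate.
diag = np.array([fd_grad(lambda v: fd_grad(g, v)[i], w)[i]
                 for i in range(len(w))])

print(row_sums)  # ~ [3. 1.]
print(diag)      # ~ [2. 0.]
```

The two results disagree as soon as the Hessian has nonzero off-diagonal entries, which is exactly the failure mode discussed in #445.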
PS: you can check out the entire list of built-in differential operators here.
Hello Everyone,
Does anyone know how to use autograd to calculate the second derivative of a multivariate function?
Any example would be highly appreciated.
The second derivative does not match the closed-form solution, while the first derivatives are spot on. I am probably misusing autograd, so I would like to ask for the correct way to set the calculation up. Any help would be appreciated.
Unexpected results
Below is the line in question; the complete script is further down.
from autograd import elementwise_grad as egrad
greeks_nd = egrad(egrad(_local_price))(args)
More specifically, I am interested in this value: egrad(egrad(_local_price))(args)[0]
BACKGROUND INFO
I am trying to calculate the Greeks of the Black-Scholes European option formula. The option price formula is differentiated by hand and the result is compared to autograd's. The second derivative "Gamma" w.r.t. price is what I could not get correctly from autograd.
For completeness, the Black-Scholes formula can be seen here: https://en.wikipedia.org/wiki/Black%E2%80%93Scholes_model
The pricing formula is called: def black76_call(x, k, r, v, t_exps):
The second derivative of the pricing formula is derived by hand and is called: def black76_call_gamma(x, k, r, v, t_exps):
The result is then compared to egrad(egrad(_local_price))(args)[0]
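One autograd-free way to narrow the problem down is to verify the hand-derived gamma against a central second difference in x alone, holding the other inputs fixed; if that matches, the discrepancy lies in how the nested egrad expression mixes coordinates rather than in the formula. A minimal standalone sketch (the parameter values are made up for illustration, not taken from the original script):

```python
import math

# Check that the hand-derived gamma equals the pure second derivative
# of the Black-76 call price w.r.t. the forward x.
N = lambda z: 0.5 * math.erfc(-z / math.sqrt(2.0))  # standard normal cdf
pdf = lambda z: math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)

def black76_call(x, k, r, v, t):
    df = math.exp(-r * t)
    d1 = (math.log(x / k) + v**2 / 2.0 * t) / (v * math.sqrt(t))
    d2 = d1 - v * math.sqrt(t)
    return df * (x * N(d1) - k * N(d2))

def black76_call_gamma(x, k, r, v, t):
    df = math.exp(-r * t)
    d1 = (math.log(x / k) + v**2 / 2.0 * t) / (v * math.sqrt(t))
    return df * pdf(d1) / (x * v * math.sqrt(t))

# hypothetical test point
x, k, r, v, t = 100.0, 95.0, 0.02, 0.25, 0.5
h = 1e-2
# central second difference in x ONLY; k, r, v, t are held fixed
fd_gamma = (black76_call(x + h, k, r, v, t)
            - 2.0 * black76_call(x, k, r, v, t)
            + black76_call(x - h, k, r, v, t)) / h**2

print(black76_call_gamma(x, k, r, v, t), fd_gamma)  # agree to ~1e-6
```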
SCRIPT
from autograd import grad
import autograd.numpy as np
from autograd.scipy.stats import norm
from autograd.scipy.special import erfc
from collections import OrderedDict
NEGINVSQR2 = -0.7071067811865475
N = norm.cdf
pdf = norm.pdf
def black76_call(x, k, r, v, t_exps):
    """Black-76 call price"""
    df = np.exp(-r * t_exps)
    d1 = (np.log(x / k) + np.power(v, 2) / 2. * t_exps) / (v * np.sqrt(t_exps))
    d2 = d1 - v * np.sqrt(t_exps)
    return df * (x * N(d1) - k * N(d2))
def black76_call_delta(x, k, r, v, t_exps):
    df = np.exp(-r * t_exps)
    d1 = (np.log(x / k) + np.power(v, 2) / 2. * t_exps) / (v * np.sqrt(t_exps))
    return df * N(d1)

def black76_call_gamma(x, k, r, v, t_exps):
    vst = v * np.sqrt(t_exps)
    df = np.exp(-r * t_exps)
    d1 = (np.log(x / k) + np.power(v, 2) / 2. * t_exps) / (v * np.sqrt(t_exps))
    return (df * pdf(d1)) / (x * vst)

def black76_call_vega(x, k, r, v, t_exps):
    df = np.exp(-r * t_exps)
    d1 = (np.log(x / k) + np.power(v, 2) / 2. * t_exps) / (v * np.sqrt(t_exps))
    return df * x * pdf(d1) * np.sqrt(t_exps)
def black_ad_greeks(args):