Closed: sannant closed this issue 5 years ago
```
=================================== FAILURES ===================================
_________________ test_directional_derivative[2-Trigonometric] _________________

Objective = <class 'tests.minimization_problems.Trigonometric'>, n = 2

    @pytest.mark.parametrize("Objective", [mp.Trigonometric, mp.Extended_Rosenbrock])
    @pytest.mark.parametrize("n", [2, 4, 10])
    def test_directional_derivative(Objective, n):
        """
        Asserts the numerical and the analytical gradient provided by Objective are close

        :param Objective: which function to minimize
        :param n: number of dimensions
        """
        for i in range(12):
            x = Objective.bounds[0] + (Objective.bounds[1] - Objective.bounds[0]) * np.random.random(n)
            x.shape = (-1, 1)
            u = np.random.normal(size=n)  # direction
            u /= np.linalg.norm(u, 2)     # normalize
            u.shape = (-1, 1)
            # TODO: Don't know which tolerance to associate with which eps
            eps = 1e-5
            print((Objective.f(x + u * eps) - Objective.f(x)) / eps)
            der_numerical = (Objective.f(x + u * eps) - Objective.f(x)) / eps
            der_analytical = (Objective.grad(x).T @ u).item()
>           assert abs(der_numerical - der_analytical) / der_analytical < 1e-3, \
                "(der_numerical - der_analytical)/der_analytical = {}".format(abs(der_numerical - der_analytical) / der_analytical)
E           AssertionError: (der_numerical - der_analytical)/der_analytical = 0.0012363472737317459
E           assert (0.00011489677930499875 / 0.09293244846830007) < 0.001
E            +  where 0.00011489677930499875 = abs((0.09281755168899507 - 0.09293244846830007))
```
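One likely cause of the failure: the forward difference `(f(x + eps*u) - f(x)) / eps` has a truncation error of order `eps`, so with `eps = 1e-5` a relative error around `1e-3` is entirely plausible and the fixed `1e-3` tolerance sits right at the edge. A central difference has error of order `eps**2` and usually passes such a tolerance comfortably. A minimal sketch, using a hypothetical smooth objective `f` and its analytical gradient `grad` in place of the repo's `Objective` class:

```python
import numpy as np

def f(x):
    # Hypothetical smooth objective standing in for Objective.f
    return float(np.sum(np.sin(x) ** 2))

def grad(x):
    # Analytical gradient of f: d/dx_i sin(x_i)^2 = 2*sin(x_i)*cos(x_i)
    return 2.0 * np.sin(x) * np.cos(x)

rng = np.random.default_rng(0)
n = 4
x = rng.standard_normal(n)
u = rng.standard_normal(n)
u /= np.linalg.norm(u)  # unit direction

eps = 1e-5
# Forward difference: truncation error O(eps)
forward = (f(x + eps * u) - f(x)) / eps
# Central difference: truncation error O(eps**2)
central = (f(x + eps * u) - f(x - eps * u)) / (2 * eps)
analytical = float(grad(x) @ u)

print(abs(forward - analytical), abs(central - analytical))
```

With the central scheme the absolute error typically drops by several orders of magnitude at the same `eps`, which would make a fixed relative tolerance like `1e-3` much safer.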
I think I should test how the error scales with epsilon rather than comparing it against a fixed tolerance.