uwhpsc-2016 / homework1

Homework #1

Exercise 2: Necessity of f argument in gradient_descent #34


rachka commented 8 years ago

I noticed that the functions gradient_step and gradient_descent do not actually use the function f in their bodies; they only use its derivative f'. Is there a non-arbitrary reason we include it as an argument in gradient_descent?

cswiercz commented 8 years ago

That's right.

A sufficient implementation of gradient_step and gradient_descent should not even use the function f. Leave the function argument in there, though.

There is an "improved" version of gradient descent that does use f to determine the best value for the step scaling sigma. Given that we will be testing your implementations on simple examples, though, this improved technique would be overkill.
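For concreteness, here is a minimal sketch of the kind of implementation described above; the argument names and defaults (sigma, epsilon, max_iter) are assumptions from this thread, not the homework's actual interface. Note that f is accepted but never called.

# A minimal sketch, not the official homework solution. The signatures
# and default values here are assumptions based on this discussion.

def gradient_step(x, df, sigma=0.5):
    """Take a single gradient descent step from x.

    Only the derivative df is needed here; f itself is unused.
    """
    return x - sigma * df(x)

def gradient_descent(f, df, x, sigma=0.5, epsilon=1e-8, max_iter=100000):
    """Minimize f starting from x using fixed-step gradient descent.

    f is accepted as an argument (matching the required interface)
    but never called; the iteration depends only on df.
    """
    for _ in range(max_iter):
        x_new = gradient_step(x, df, sigma)
        if abs(x_new - x) < epsilon:
            return x_new
        x = x_new
    return x

# Example: minimize f(x) = (x - 3)**2, whose derivative is 2*(x - 3).
xmin = gradient_descent(lambda x: (x - 3)**2, lambda x: 2*(x - 3), x=0.0)

The "improved" variant mentioned above is the only place f itself would enter: the loop would evaluate f(x) to choose sigma adaptively (e.g. via a backtracking line search) rather than using a fixed step.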

anders34 commented 8 years ago

When it comes to documentation, when we list the types of our parameters, what exactly are the types of f and df? Is there a type called function?

quantheory commented 8 years ago

Yes, there is, as you can see by calling the built-in type function:

>>> type(lambda x: x)
<class 'function'>

Alternatively, you could use the word "callable" to describe this sort of argument, which is what the NumPy/SciPy documentation does. (They use this word because a custom class can be "called" like a function, though that's not a feature that we're likely to use in this class.)
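For example, a NumPy-style docstring might describe such parameters like this; the parameter names and descriptions below are assumptions drawn from this thread, not the homework's required wording.

def gradient_descent(f, df, x, sigma=0.5):
    """Minimize a function of one variable using gradient descent.

    Parameters
    ----------
    f : callable
        The function to minimize. (Not called by this implementation,
        but kept in the signature.)
    df : callable
        The derivative of f.
    x : float
        Initial guess.
    sigma : float, optional
        Step size scaling.

    Returns
    -------
    float
        Approximate location of a local minimum of f.
    """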