In class/at lecture we used np.logspace(-4, 4, 50) (or 33 as the last argument) to specify the range of hyperparameters to search.
This range is quite broad, and the search returns the hyperparameter that minimizes the MSE. In class ABN mentioned that we could then build a new, narrower range with np.logspace centered on the hyperparameter returned by the first search.
My question is as follows:
Are we expected to go through this process and thereby iteratively narrow down the search for the best hyperparameter?
If so, should we simply replace the original range with the newly found one (with a comment noting that we found it via iteration), or are we expected to copy all of the code and insert the new range into that copy, so that the code that produced the new range remains in the notebook?
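For what it's worth, the two-stage refinement could be sketched like this (a minimal NumPy-only illustration, assuming closed-form ridge regression with a simple hold-out MSE; the data, split, and one-decade refinement window are my own assumptions, not anything specified in lecture):

```python
import numpy as np

# Hypothetical synthetic data, just to make the sketch runnable.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))
y = X @ rng.normal(size=10) + rng.normal(scale=0.5, size=120)
X_tr, X_val = X[:80], X[80:]
y_tr, y_val = y[:80], y[80:]

def val_mse(alpha):
    # Closed-form ridge solution: w = (X'X + alpha*I)^{-1} X'y
    w = np.linalg.solve(X_tr.T @ X_tr + alpha * np.eye(X.shape[1]),
                        X_tr.T @ y_tr)
    return np.mean((X_val @ w - y_val) ** 2)

def best_alpha(alphas):
    # Return the alpha in the grid with the smallest validation MSE.
    return alphas[np.argmin([val_mse(a) for a in alphas])]

# Stage 1: the broad grid from lecture.
coarse = np.logspace(-4, 4, 50)
alpha1 = best_alpha(coarse)

# Stage 2: a narrower grid, one decade either side of the stage-1
# winner (centered in log space, since logspace is logarithmic).
fine = np.logspace(np.log10(alpha1) - 1, np.log10(alpha1) + 1, 51)
alpha2 = best_alpha(fine)
```

The key detail is that the refined grid is centered in log space (via np.log10), not in linear space, so it stays evenly spaced on the same logarithmic scale as the original grid.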
Thanks in advance.