When running `make html` inside `docs`, I get this warning:

```
['ManualAlphaSelection(model, ax=None, alphas=None, cv=None, scoring=None, **kwargs)', ':module: yellowbrick.regressor.alphas', '', 'Bases: :class:`yellowbrick.regressor.alphas.AlphaSelection`', '', '', '', 'The ``AlphaSelection`` visualizer requires a "RegressorCV", that is a', 'specialized class that performs cross-validated alpha-selection on behalf', "of the model. If the regressor you wish to use doesn't have an associated", '"CV" estimator, or for some reason you would like to specify more control', 'over the alpha selection process, then you can use this manual alpha', 'selection visualizer, which is essentially a wrapper for', '``cross_val_score``, fitting a model for each alpha specified.', '', '', ':Parameters:', '', ' **model** : a Scikit-Learn regressor', '', ' Should be an instance of a regressor, and specifically one whose name', ' doesn\'t end with "CV". The regressor must support a call to', ' ``set_params(alpha=alpha)`` and be fit multiple times. If the', ' regressor name ends with "CV" a ``YellowbrickValueError`` is raised.', ' ', '', ' **ax** : matplotlib Axes, default: None', '', ' The axes to plot the figure on. If None is passed in the current axes', ' will be used (or generated if required).', ' ', '', ' **alphas** : ndarray or Series, default: np.logspace(-10, 2, 200)', '', ' An array of alphas to fit each model with', ' ', '', ' **cv** : int, cross-validation generator or an iterable, optional', '', ' Determines the cross-validation splitting strategy.', ' Possible inputs for cv are:', ' - None, to use the default 3-fold cross validation,', ' - integer, to specify the number of folds in a `(Stratified)KFold`,', ' - An object to be used as a cross-validation generator.', ' - An iterable yielding train, test splits.', ' ', ' This argument is passed to the', ' ``sklearn.model_selection.cross_val_score`` method to produce the', ' cross validated score for each alpha.', ' ', '', ' **scoring** : string, callable or None, optional, default: None', '', ' A string (see model evaluation documentation) or', ' a scorer callable object / function with signature', ' ``scorer(estimator, X, y)``.', ' ', ' This argument is passed to the', ' ``sklearn.model_selection.cross_val_score`` method to produce the', ' cross validated score for each alpha.', ' ', '', ' **kwargs** : dict', '', ' Keyword arguments that are passed to the base class and may influence', ' the visualization as defined in other Visualizers.', '', '.. rubric:: Notes', '', '', 'This class does not take advantage of estimator-specific searching and is', 'therefore less optimal and more time consuming than the regular', '"RegressorCV" estimators.', '', '.. rubric:: Examples', '', '', '>>> from yellowbrick.regressor import ManualAlphaSelection', '>>> from sklearn.linear_model import Ridge', '>>> model = ManualAlphaSelection(', "... Ridge(), cv=12, scoring='neg_mean_squared_error'", '... )', '...', '>>> model.fit(X, y)', '>>> model.poof()', '', '', '.. py:method:: ManualAlphaSelection.draw()', ' :module: yellowbrick.regressor.alphas', '', ' ', ' ', ' Draws the alphas values against their associated error in a similar', ' fashion to the AlphaSelection visualizer.', ' ', ' ', '', '.. py:method:: ManualAlphaSelection.fit(X, y, **args)', ' :module: yellowbrick.regressor.alphas', '', ' ', ' ', ' The fit method is the primary entry point for the manual alpha', ' selection visualizer. It sets the alpha param for each alpha in the', ' alphas list on the wrapped estimator, then scores the model using the', ' passed in X and y data set. Those scores are then aggregated and', ' drawn using matplotlib.']:35: ERROR: Unexpected indentation.
```
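For context, the docstring in the warning describes `ManualAlphaSelection` as "essentially a wrapper for `cross_val_score`, fitting a model for each alpha specified". A minimal NumPy-only sketch of that idea follows; it is not Yellowbrick's implementation, and the closed-form ridge solver, fold splitting, and variable names here are illustrative assumptions:

```python
import numpy as np

def ridge_fit(X, y, alpha):
    # Closed-form ridge solution: w = (X^T X + alpha * I)^-1 X^T y
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

def cv_mse(X, y, alpha, n_splits=3):
    # Simple K-fold cross-validated mean squared error for one alpha,
    # standing in for what cross_val_score would compute.
    indices = np.arange(len(X))
    errors = []
    for test_idx in np.array_split(indices, n_splits):
        train_idx = np.setdiff1d(indices, test_idx)
        w = ridge_fit(X[train_idx], y[train_idx], alpha)
        pred = X[test_idx] @ w
        errors.append(np.mean((pred - y[test_idx]) ** 2))
    return float(np.mean(errors))

# Synthetic regression data for demonstration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.1 * rng.normal(size=60)

# Manual alpha selection: score every candidate, keep the best.
alphas = np.logspace(-3, 3, 7)
scores = [cv_mse(X, y, a) for a in alphas]
best_alpha = alphas[int(np.argmin(scores))]
```

As the docstring's Notes say, this brute-force loop is slower than estimator-specific "RegressorCV" search, since it refits the model from scratch for every alpha and fold.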