Regression
This estimator fits a linear regression model whose loss function is the linear least squares function and whose regularization is given by the l2-norm; it is also known as Ridge regression or Tikhonov regularization. It has built-in support for multi-variate regression (i.e., when y is a 2d-array of shape [n_samples, n_targets]).
Coefficient: the ridge coefficients minimize a penalized residual sum of squares, min_w ||Xw - y||_2^2 + alpha * ||w||_2^2, with alpha >= 0 controlling the strength of the regularization.
from sklearn.linear_model import Ridge
clf = Ridge(alpha=1.0)  # alpha is the regularization strength
clf.fit(X, y)           # X: [n_samples, n_features]; y: [n_samples] or [n_samples, n_targets]
Classification
Graphical Model
Runner-ups: Weka, R, Octave
pip install scipy
pip install numpy
pip install -U scikit-learn
- How can we assess the expected error of a learning algorithm on a problem? For example, having used a classification algorithm to train a classifier on a dataset drawn from some application, can we say with enough confidence that, when it is later used in real life, its expected error rate will be less than, say, 2 percent?
- Given two learning algorithms, how can we say one has less error than the other for a given application? The algorithms compared can be different, for example parametric versus nonparametric, or they can use different hyperparameter settings. For example, given a multilayer perceptron with four hidden units and another with eight hidden units, we would like to be able to say which one has less expected error; or with the k-nearest neighbor classifier, we would like to find the best value of k (see the cross-validation sketch after this list).
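A minimal sketch of one common answer to both questions, assuming scikit-learn's bundled digits dataset purely for illustration: k-fold cross-validation estimates the expected error of each setting on the same folds, so the two k-NN configurations below can be compared directly.

```python
# Illustrative sketch: estimate expected error with 10-fold cross-validation
# and compare two k-NN settings (k = 3 vs. k = 7) on the same dataset.
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

digits = load_digits()
for k in (3, 7):
    scores = cross_val_score(KNeighborsClassifier(n_neighbors=k),
                             digits.data, digits.target, cv=10)
    print(k, scores.mean(), scores.std())  # mean accuracy and spread over folds
```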
The relative importance of these factors changes depending on the application.
- For example, if training is done once in the factory, then training time and space complexity are not important; if adaptability during use is required, then they do become important.
- Most learning algorithms use 0/1 loss and take error as the single criterion to be minimized; recently, cost-sensitive variants of these algorithms have been proposed to take other cost criteria into account.
1/3 Classification
Examples
Spam detection, Image recognition, Handwritten digit recognition
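A minimal sketch of the classification workflow, assuming scikit-learn's bundled digits dataset and an SVM classifier chosen purely for illustration:

```python
# Illustrative sketch: handwritten digit recognition with an SVM.
from sklearn import datasets, svm
from sklearn.model_selection import train_test_split

digits = datasets.load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

clf = svm.SVC(gamma=0.001)        # gamma picked for illustration only
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))  # accuracy on the held-out digits
```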
2/3 Regression
Examples
Drug response, Stock prices, Housing price prediction
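A minimal regression sketch, assuming the bundled diabetes dataset as a stand-in for, e.g., drug-response data:

```python
# Illustrative sketch: fitting an ordinary least squares model.
from sklearn import datasets, linear_model

diabetes = datasets.load_diabetes()
reg = linear_model.LinearRegression()
reg.fit(diabetes.data, diabetes.target)
print(reg.coef_)                                  # one coefficient per feature
print(reg.score(diabetes.data, diabetes.target))  # R^2 on the training data
```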
3/3 Clustering
Examples
Customer segmentation, Grouping experiment outcomes, Document tagging
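A minimal clustering sketch, assuming synthetic blob data in place of, say, real customer records:

```python
# Illustrative sketch: k-means clustering on synthetic 2-D blobs.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
km = KMeans(n_clusters=3, random_state=0).fit(X)
print(km.labels_[:10])      # cluster assigned to each of the first 10 points
print(km.cluster_centers_)  # learned centroids
```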
Techniques
1/3 Dimensionality reduction
Visualization, Increased efficiency
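A minimal sketch, assuming PCA on the bundled digits dataset to get a 2-D view of 64-dimensional data:

```python
# Illustrative sketch: PCA reduces 64 pixel features to 2 components
# suitable for plotting.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

digits = load_digits()
X_2d = PCA(n_components=2).fit_transform(digits.data)
print(X_2d.shape)  # (1797, 2)
```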
2/3 Model selection
Improved accuracy via parameter tuning
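A minimal sketch of parameter tuning, assuming a small illustrative grid over an SVM's C and gamma:

```python
# Illustrative sketch: cross-validated grid search over SVM hyperparameters.
from sklearn.datasets import load_digits
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

digits = load_digits()
param_grid = {"C": [0.1, 1, 10], "gamma": [0.001, 0.01]}  # illustrative grid
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(digits.data, digits.target)
print(search.best_params_, search.best_score_)  # best setting and its CV score
```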
3/3 Preprocessing
Transforming input data such as text for use with machine learning algorithms.
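A minimal sketch of text preprocessing, assuming TF-IDF vectorization of a toy two-document corpus:

```python
# Illustrative sketch: turning raw text into numeric features with TF-IDF.
from sklearn.feature_extraction.text import TfidfVectorizer

docs = ["the quick brown fox", "jumped over the lazy dog"]  # toy corpus
vec = TfidfVectorizer()
X = vec.fit_transform(docs)  # sparse matrix of shape [n_documents, n_terms]
print(X.shape)
print(vec.vocabulary_)       # term -> column index mapping
```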
A Tour of Machine Learning Algorithms
Source: http://scikit-learn.org/stable/_static/ml_map.png
References