therealcyberlord / coronavirus_visualization_and_prediction

This repository tracks the spread of the novel coronavirus, also known as SARS-CoV-2, a contagious respiratory virus that first emerged in Wuhan in December 2019. On 2/11/2020, the World Health Organization officially named the disease COVID-19.
https://www.kaggle.com/therealcyberlord/coronavirus-covid-19-visualization-prediction

DataConversionWarning: A column-vector y was passed when a 1d array was expected. Please change the shape of y to (n_samples, ), for example using ravel(). y = column_or_1d(y, warn=True) #2

Closed · MilSix closed this issue 4 years ago

MilSix commented 4 years ago

Not sure how, but I got a DataConversionWarning at the following code cell (1):

```python
svm_confirmed = svm_search.best_estimator_

svm_confirmed = SVR(shrinking=True, kernel='poly', gamma=0.01, epsilon=1, degree=3, C=0.1)
svm_confirmed.fit(X_train_confirmed, y_train_confirmed)
svm_pred = svm_confirmed.predict(future_forcast)
```

Output (1):

```
/home/jupyterlab/conda/envs/python/lib/python3.6/site-packages/sklearn/utils/validation.py:761: DataConversionWarning: A column-vector y was passed when a 1d array was expected. Please change the shape of y to (n_samples, ), for example using ravel(). y = column_or_1d(y, warn=True)
```
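The warning text itself points at the shape of y, so presumably the root-cause fix would be to flatten it before fitting; a minimal sketch, assuming y_train_confirmed currently has shape (n_samples, 1):

```python
# Sketch of the fix the warning suggests (assumption: y_train_confirmed
# is a column vector of shape (n_samples, 1)).
import numpy as np

y_train_confirmed = np.asarray(y_train_confirmed).ravel()  # now shape (n_samples,)
svm_confirmed.fit(X_train_confirmed, y_train_confirmed)
```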

(2) Bayesian ridge polynomial regression:

```python
tol = [1e-6, 1e-5, 1e-4, 1e-3, 1e-2]
alpha_1 = [1e-7, 1e-6, 1e-5, 1e-4, 1e-3]
alpha_2 = [1e-7, 1e-6, 1e-5, 1e-4, 1e-3]
lambda_1 = [1e-7, 1e-6, 1e-5, 1e-4, 1e-3]
lambda_2 = [1e-7, 1e-6, 1e-5, 1e-4, 1e-3]
normalize = [True, False]

bayesian_grid = {'tol': tol, 'alpha_1': alpha_1, 'alpha_2': alpha_2,
                 'lambda_1': lambda_1, 'lambda_2': lambda_2, 'normalize': normalize}

bayesian = BayesianRidge(fit_intercept=False)
bayesian_search = RandomizedSearchCV(bayesian, bayesian_grid, scoring='neg_mean_squared_error',
                                     cv=3, return_train_score=True, n_jobs=-1, n_iter=40, verbose=1)
bayesian_search.fit(bayesian_poly_X_train_confirmed, y_train_confirmed)
```
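(For reference, after this fit the notebook presumably pulls the tuned model out of the search object the same way cell (1) does with svm_search; a minimal sketch using the names above:)

```python
# Sketch: inspect the finished randomized search (names follow the cell above).
best_bayesian = bayesian_search.best_estimator_   # refit BayesianRidge with the winning hyperparameters
print(bayesian_search.best_params_)               # the sampled combination that scored best
print(-bayesian_search.best_score_)               # mean CV squared error (scoring was neg_mean_squared_error)
```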

MilSix commented 4 years ago

Clicked the wrong button. On the Bayesian ridge polynomial regression cell I got what seemed like an infinite loop of warning (1). After about (x_train) warnings it did display the following output:

```
RandomizedSearchCV(cv=3, error_score='raise-deprecating',
                   estimator=BayesianRidge(alpha_1=1e-06, alpha_2=1e-06, compute_score=False,
                                           copy_X=True, fit_intercept=False, lambda_1=1e-06,
                                           lambda_2=1e-06, n_iter=300, normalize=False,
                                           tol=0.001, verbose=False),
                   fit_params=None, iid='warn', n_iter=40, n_jobs=-1,
                   param_distributions={'tol': [1e-06, 1e-05, 0.0001, 0.001, 0.01],
                                        'alpha_1': [1e-07, 1e-06, 1e-05, 0.0001, 0.001],
                                        'alpha_2': [1e-07, 1e-06, 1e-05, 0.0001, 0.001],
                                        'lambda_1': [1e-07, 1e-06, 1e-05, 0.0001, 0.001],
                                        'lambda_2': [1e-07, 1e-06, 1e-05, 0.0001, 0.001],
                                        'normalize': [True, False]},
                   pre_dispatch='2*n_jobs', random_state=None, refit=True,
                   return_train_score=True, scoring='neg_mean_squared_error', verbose=1)
```

therealcyberlord commented 4 years ago

Thanks for your feedback. Does the code work aside from the warnings?

MilSix commented 4 years ago

Yes, it does. It took me a long while to scroll past the warnings... At first I thought I was in for an infinite loop, but luckily it stopped warning, I guess once it reached the last value of x_test.

therealcyberlord commented 4 years ago

Insert this snippet of code to ignore the warnings:

```python
import warnings
warnings.filterwarnings("ignore")
```

RandomizedSearchCV essentially tries random combinations of different hyperparameters, which explains the long stream of warning messages you saw.
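If silencing everything feels too broad, the filter can also be scoped to just this one warning; a sketch, assuming the message comes from scikit-learn's DataConversionWarning:

```python
# Sketch: ignore only scikit-learn's DataConversionWarning instead of all warnings.
import warnings
from sklearn.exceptions import DataConversionWarning

warnings.filterwarnings("ignore", category=DataConversionWarning)
```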

abramosrotex commented 1 year ago

> Insert this snippet of code to ignore the warnings:
>
> ```python
> import warnings
> warnings.filterwarnings("ignore")
> ```
>
> RandomizedSearchCV essentially tries random combinations of different hyperparameters, which explains the long stream of warning messages you saw.

Thanks very much for this; it was very useful for me, as I had been looking for a way to remove the warnings.