shankarpandala / lazypredict

Lazy Predict helps build a lot of basic models without much code and helps you understand which models work better without any parameter tuning
MIT License

Taking too much time to run #317

Open Vinitkumar89 opened 3 years ago

Vinitkumar89 commented 3 years ago

Description

I tried to run the lazypredict regressor on a Black Friday sales training dataset and it gets stuck at 60%-63%. The dataset has 55,000 rows.

What I Did



 60%|█████████████████████████████████████████████████▌                                | 26/43 [33:11<04:11, 14.82s/it]
brendalf commented 3 years ago

Hi @Vinitkumar89 , thank you for the report.

How many features does your dataset have? Are there categorical features, or are all features numerical?

Vinitkumar89 commented 3 years ago

Hi @brendalf.

Sorry for the late reply. It has 2 categorical and 10 numerical features, with the train data having 550,068 rows and the test data having 233,529 rows.

brendalf commented 3 years ago

@shankarpandala, can you help me here? Do you know which model lazypredict is stuck on (step 26/43)? Different versions of lazypredict tend to have a different number of classifiers/regressors to train. I think a quick win here could be having the progress bar show which model is currently running.

shankarpandala commented 3 years ago

We can see which model is running by setting verbose>1. Some models take a really long time and a lot of memory to build.

I have already manually removed those from the list, but there are still some models that take a long time.

My long-term plan is to divide the algorithms by time complexity and let users choose which complexity tier they want.
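A minimal sketch of such a run, assuming a synthetic regression problem as a stand-in for the real data (make_regression here is only a placeholder):

from lazypredict.Supervised import LazyRegressor
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a dataset like the Black Friday one
X, y = make_regression(n_samples=5000, n_features=12, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A verbose setting above 0 prints each model's scores as it finishes,
# so you can see roughly where a long run is stalling
reg = LazyRegressor(verbose=1, ignore_warnings=True, custom_metric=None)
models, predictions = reg.fit(X_train, X_test, y_train, y_test)
print(models)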

abbasshujah commented 3 years ago

I would like to work on this. I might have an idea on how to improve the speed. How do I contribute?

SSMK-wq commented 2 years ago

@shankarpandala - I also face the same issue. It has been stuck at 74% for more than 5 hours. My dataset is also small: it has only 5,900 rows and 70 features. Could 70 features be the culprit? I haven't done feature engineering/selection yet; I just passed the train and test sets as-is to see how the models do. Can you help me please? Is there any way to fix this issue? I can sponsor by paying 50 USD.

jahnfirth commented 2 years ago

I've had a similar issue to the one described by @SSMK-wq. My LazyRegressor got stuck on 74% too and I had left it for 2h+.

My dataset is around 8000 rows by 150 columns, filled with binary 1/0 independent variables predicting a continuous target variable.

I use lazypredict as an initial screen and have enjoyed its user-friendly, low-code workflow.

@shankarpandala It would be great if you could include a timeout threshold parameter in LazyRegressor() so that, when a model exceeds it, the algorithm skips to the next model. This would save a lot of time and avoid waiting on a model you probably wouldn't use anyway.

Thanks a lot for all your work. Top stuff!
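Nothing like this exists in lazypredict itself; purely as a caller-side illustration of the requested behaviour, a per-model time budget could be sketched with plain scikit-learn estimators and a child process (all names below are hypothetical):

# Hypothetical sketch, not part of lazypredict: fit each model in a child
# process and skip it if it exceeds a time budget.
import multiprocessing as mp

from sklearn.datasets import make_regression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split


def fit_and_score(model_cls, X_train, X_test, y_train, y_test, out_queue):
    # Runs in the child process; reports the test R^2 back through the queue.
    model = model_cls().fit(X_train, y_train)
    out_queue.put(r2_score(y_test, model.predict(X_test)))


if __name__ == "__main__":
    X, y = make_regression(n_samples=5000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    timeout_seconds = 60  # per-model budget
    for model_cls in [Ridge, GaussianProcessRegressor]:
        out_queue = mp.Queue()
        proc = mp.Process(
            target=fit_and_score,
            args=(model_cls, X_train, X_test, y_train, y_test, out_queue),
        )
        proc.start()
        proc.join(timeout_seconds)
        if proc.is_alive():
            proc.terminate()  # model exceeded the budget: skip it
            proc.join()
            print(f"{model_cls.__name__}: skipped after {timeout_seconds}s")
        else:
            try:
                print(f"{model_cls.__name__}: R2 = {out_queue.get(timeout=5):.3f}")
            except Exception:
                print(f"{model_cls.__name__}: fit failed")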

shankarpandala commented 2 years ago

There is already a way to skip models by specifying the algorithms you want. Time-based skipping doesn't work on Windows, so I didn't implement it.

shankarpandala commented 2 years ago

@SSMK-wq Maybe some algorithm is taking a long time to train. You can skip the algorithms that are taking too long by specifying the list of algorithms you want.

SSMK-wq commented 2 years ago

@shankarpandala - how do we specify the list of algorithms we want to try? Is there any syntax you can share? I'm not able to find anything in the documentation. Can you help please?

hakkache commented 1 year ago

Hello all,

@Vinitkumar89 maybe you are facing the same issue as me. Set the parameters verbose=1 and ignore_warnings=False to see the warning messages.

In my case I am using OneHotEncoder for the categorical data, but when I fit the data to LazyRegressor it shows a warning about unknown categories being found (there are some categories in the test dataset that are not present in the training dataset). With OneHotEncoder there is a way to avoid this by setting the parameter handle_unknown="ignore", but I didn't find anything in the lazypredict documentation to solve this issue.
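For reference, the scikit-learn option being described here (independent of lazypredict) behaves like this; the toy column and category names are made up for illustration:

import pandas as pd
from sklearn.preprocessing import OneHotEncoder

# Toy example: the test set contains a category ("C") never seen in training
train = pd.DataFrame({"city": ["A", "B", "A"]})
test = pd.DataFrame({"city": ["A", "C"]})

# handle_unknown="ignore" encodes unseen categories as all-zero rows
# instead of raising an error at transform time
enc = OneHotEncoder(handle_unknown="ignore")
enc.fit(train)
print(enc.transform(test).toarray())
# -> [[1. 0.]
#     [0. 0.]]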

@shankarpandala could you please help? Is there any way to avoid this issue?

Thanks guys for this interesting subject.

hakkache commented 1 year ago

Hello all,

I hope you're doing fine :) Is there any news regarding my question?

Thanks for your help

danielwalke commented 1 year ago

@SSMK-wq It seems that you can specify it either with a string ("all") or with a list of classifiers (probably model classes from scikit-learn):

if self.classifiers == "all":
    self.classifiers = CLASSIFIERS
else:
    try:
        temp_list = []
        for classifier in self.classifiers:
            full_name = (classifier.__name__, classifier)
            temp_list.append(full_name)
        self.classifiers = temp_list
    except Exception as exception:
        print(exception)
        print("Invalid Classifier(s)")
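Based on that logic, a minimal usage sketch, assuming scikit-learn classifier classes are passed directly (load_breast_cancer is just a placeholder dataset):

from lazypredict.Supervised import LazyClassifier
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Pass the classes themselves; display names are derived from __name__
# as in the snippet above
clf = LazyClassifier(verbose=0, ignore_warnings=True, custom_metric=None,
                     classifiers=[LogisticRegression, DecisionTreeClassifier])
models, predictions = clf.fit(X_train, X_test, y_train, y_test)
print(models)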

dchecks commented 1 year ago

@SSMK-wq Here's some code to include only the regressors that are in the "chosen_regressors" list. The actual list of available regressors is quite long; if you want them all, look at the definition of the LazyRegressor class. I've just included the first two in the list for this example.

from sklearn.utils import all_estimators
from sklearn.base import RegressorMixin

from lazypredict.Supervised import LazyRegressor

# Only these regressors will be run
chosen_regressors = [
    'SVR',
    'BaggingRegressor'
]

REGRESSORS = [
    est
    for est in all_estimators()
    if (issubclass(est[1], RegressorMixin) and (est[0] in chosen_regressors))
]

reg = LazyRegressor(verbose=1, ignore_warnings=False, custom_metric=None, regressors=REGRESSORS)

I had this issue with the GaussianProcessRegressor. You'll see the code this has been adapted from here: https://github.com/shankarpandala/lazypredict/blob/aad245d602f080575d05ec68750fa9c229aeea30/lazypredict/Supervised.py#L77

Unco3892 commented 1 year ago

@dchecks I have the same issue, and using your method, I specified all the models except GaussianProcessRegressor and the training worked. Thanks for posting this.

Lramos505 commented 1 year ago

I tried adding an LGBM regressor to the list of chosen regressors and it wasn't added. Any ideas what I might have done wrong? [image]

KayO-GH commented 1 year ago

@Lramos505 According to the codebase, LGBMRegressor is already included.
Check out line 84 at https://github.com/shankarpandala/lazypredict/blob/dev/lazypredict/Supervised.py. I see it in my output.

Also, you can probably reverse-engineer this Stack Overflow answer to get where you want if you still have issues: https://stackoverflow.com/a/76557962/6712832
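One possible explanation for @Lramos505's result, offered here as an assumption: all_estimators() only lists scikit-learn's own estimators, so LGBMRegressor will never show up in a list built that way and has to be appended explicitly (this assumes the lightgbm package is installed):

from lightgbm import LGBMRegressor
from sklearn.base import RegressorMixin
from sklearn.utils import all_estimators

chosen_regressors = ['SVR', 'BaggingRegressor']

REGRESSORS = [
    est
    for est in all_estimators()
    if (issubclass(est[1], RegressorMixin) and (est[0] in chosen_regressors))
]

# LGBMRegressor is not part of scikit-learn, so add it manually as a
# (name, class) tuple in the same format that all_estimators() uses
REGRESSORS.append(("LGBMRegressor", LGBMRegressor))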

surzua commented 1 year ago

Just for completeness, here is the code for classification algorithms. Also, from my experience, SVC takes too long on problems with real data, so it's better to drop it from the classifiers tried by LazyClassifier.

from sklearn.utils import all_estimators
from sklearn.base import ClassifierMixin

# Classifiers to exclude (e.g. SVC, which is slow on real data)
removed_classifiers = [
    "ClassifierChain",
    "ComplementNB",
    "GradientBoostingClassifier",
    "GaussianProcessClassifier",
    "HistGradientBoostingClassifier",
    "MLPClassifier",
    "LogisticRegressionCV",
    "MultiOutputClassifier",
    "MultinomialNB",
    "OneVsOneClassifier",
    "OneVsRestClassifier",
    "OutputCodeClassifier",
    "RadiusNeighborsClassifier",
    "VotingClassifier",
    "SVC",
    "LabelPropagation",
    "LabelSpreading",
    "NuSVC",
]

classifiers_list = [
    est
    for est in all_estimators()
    if (issubclass(est[1], ClassifierMixin) and (est[0] not in removed_classifiers))
]
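Presumably the resulting classifiers_list can then be passed straight to LazyClassifier, mirroring the LazyRegressor call shown earlier in the thread (X_train, X_test, y_train, y_test being your own splits):

from lazypredict.Supervised import LazyClassifier

clf = LazyClassifier(verbose=0, ignore_warnings=True, custom_metric=None,
                     classifiers=classifiers_list)
# models, predictions = clf.fit(X_train, X_test, y_train, y_test)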