🙌 Kart of 232+ projects based on machine learning, deep learning, computer vision, natural language processing, and more. Show your support by ✨ starring this repository.
PROJECT TITLE: Fragrance Price Prediction
GOAL: To predict the price of fragrances.
DATASET: https://www.kaggle.com/aryantiwari123/pricelistcsv
WHAT I HAD DONE: I first performed exploratory data analysis (EDA) on the Pricelist dataset, covering data cleaning, data manipulation, data preprocessing, and data visualization. I then built models with 10 different machine learning classification algorithms and compared their accuracy. For each algorithm I report the accuracy score, training score, classification report, and confusion matrix, while the EDA part includes a variety of plots for visualizing the dataset.
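A minimal sketch of the EDA steps described above is shown below. The file name `pricelist.csv` and the column name `price` are assumptions for illustration; the actual schema of the Kaggle dataset may differ.

```python
# Minimal EDA sketch, assuming the Kaggle CSV is saved locally as "pricelist.csv".
# The column name "price" is a placeholder; the real dataset may use different names.
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

df = pd.read_csv("pricelist.csv")

# Data cleaning: drop duplicate rows and rows with missing values
df = df.drop_duplicates().dropna()

# Quick look at structure and summary statistics
df.info()
print(df.describe())

# Visualization: distribution of the (placeholder) price column
sns.histplot(df["price"], kde=True)
plt.title("Price distribution")
plt.show()
```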
MODELS USED: I used a total of 10 machine learning classification algorithms on this dataset (a sketch of the comparison loop follows the list below):
Logistic Regression: 62.40%
KNeighborsClassifier: 60.46%
SVC: 62.45%
Naive Bayes: 60.26%
DecisionTreeClassifier: 70.07%
RandomForestClassifier: 74.77%
AdaBoostClassifier: 34.29%
GradientBoostingClassifier: 73.90%
ExtraTreesClassifier: 78.70%
BaggingClassifier: 98.47%
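The sketch below shows a comparison loop over these 10 classifiers, reporting the metrics listed above. It builds on the `df` from the EDA sketch; the target column `price_class` and the default hyperparameters are assumptions, since the notebook's actual feature engineering and settings are not specified here.

```python
# Sketch of the model-comparison loop; "price_class" is a placeholder for the
# encoded target column actually used in the notebook.
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, classification_report, confusion_matrix
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import (RandomForestClassifier, AdaBoostClassifier,
                              GradientBoostingClassifier, ExtraTreesClassifier,
                              BaggingClassifier)

# Placeholder feature/target split from the cleaned dataframe built earlier
X = df.drop(columns=["price_class"])
y = df["price_class"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "KNeighborsClassifier": KNeighborsClassifier(),
    "SVC": SVC(),
    "Naive Bayes": GaussianNB(),
    "DecisionTreeClassifier": DecisionTreeClassifier(),
    "RandomForestClassifier": RandomForestClassifier(),
    "AdaBoostClassifier": AdaBoostClassifier(),
    "GradientBoostingClassifier": GradientBoostingClassifier(),
    "ExtraTreesClassifier": ExtraTreesClassifier(),
    "BaggingClassifier": BaggingClassifier(),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)
    print(f"--- {name} ---")
    print("Training score :", model.score(X_train, y_train))
    print("Accuracy score :", accuracy_score(y_test, y_pred))
    print(classification_report(y_test, y_pred))
    print(confusion_matrix(y_test, y_pred))
```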
LIBRARIES:
pandas
NumPy
Matplotlib
Seaborn
SciPy
scikit-learn
CONCLUSION:
The Bagging Classifier gave the best result, with an accuracy score of 98.47%.
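For reference, a minimal sketch of checking the best-performing model on its own is shown below. It reuses the `X` and `y` from the comparison sketch above and assumes a default scikit-learn BaggingClassifier (decision-tree base estimators); the hyperparameters used in the original notebook may differ.

```python
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

# Default BaggingClassifier; the notebook's actual settings are not specified here
bagging = BaggingClassifier(random_state=42)

# Cross-validated accuracy is a more robust check than a single train/test split
scores = cross_val_score(bagging, X, y, cv=5, scoring="accuracy")
print("Mean CV accuracy:", scores.mean())
```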