Xtra-Computing / thundersvm

ThunderSVM: A Fast SVM Library on GPUs and CPUs

Memory limit in prediction? #265

Open Song-Yuqi opened 1 year ago

Song-Yuqi commented 1 year ago

Hello, I wonder if there is a parameter to set a maximum memory limit for `predict` when using a Jupyter notebook like this:

```python
x1_test = X_test.drop('Pixel', inplace=False, axis=1)
predictions1 = clf_svm1.predict(x1_test)
```

When I run the above prediction, I get this error message:

```
Canceled future for execute_request message before replies were done
The Kernel crashed while executing code in the current cell or a previous cell. Please review the code in the cell(s) to identify a possible cause of the failure. Click here for more info. View Jupyter log for further details.
```

The clf_svm1 model is trained in this way:

```python
from thundersvm import SVC
import joblib
import os

os.environ["CUDA_VISIBLE_DEVICES"] = "0"
os.environ["KMP_DUPLICATE_LIB_OK"] = "TRUE"

x1 = X_train.drop('Pixel', inplace=False, axis=1)
y1 = y_train.drop('Pixel', inplace=False, axis=1).values.ravel()

clf_svm1 = SVC(kernel='linear', random_state=0, probability=True,
               n_jobs=-1, gpu_id=0, max_mem_size=40000)
clf_svm1.fit(x1, y1)

joblib.dump(clf_svm1, dirs + '/SVM1.pkl')  # save the model
```

The training succeeded and the model was saved, but the prediction failed. I saw someone use `-m` when predicting from the command line in a terminal, but I couldn't find a parameter that can be set on the `predict` function when using Jupyter. How can I fix this? Or is this error caused by something other than memory?

zeyiwen commented 1 year ago

You can find the parameters on this page. Setting `max_mem_size` should work.
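For context, `max_mem_size` is a constructor argument of the scikit-learn-style estimators rather than an argument of `predict()`, so the limit travels with the estimator object. Below is a minimal sketch of that usage; the 10000 value (in MB, per the parameters page) and the assumption that the stored limit is also honored at prediction time are mine, not something stated explicitly in this thread:

```python
# Minimal sketch, assuming the estimator applies max_mem_size at predict
# time as well as at fit time.
from thundersvm import SVC

# The memory cap is given to the constructor, not to predict():
clf = SVC(kernel='linear', gpu_id=0, max_mem_size=10000)  # assumed unit: MB
clf.fit(x1, y1)

# predict() takes no memory argument; it reuses the limit stored on the estimator.
predictions = clf.predict(x1_test)
```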

Song-Yuqi commented 1 year ago

> You can find the parameters on this page. Setting `max_mem_size` should work.

Thank you! I changed `max_mem_size` from 40000 to 10000; training still took almost the same time as before, and the prediction now works well. I'm still a little surprised, though, that the same parameter worked during training but not during prediction.
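If prediction memory ever remains a problem even with a lower `max_mem_size`, one generic workaround (not something suggested in this thread, just a common pattern) is to predict in chunks so that only part of the test set is handed to the model at a time. A sketch, assuming `x1_test` is a pandas DataFrame as above and `predict_in_chunks` is a hypothetical helper of mine, not part of thundersvm:

```python
import numpy as np

def predict_in_chunks(model, X, chunk_size=100_000):
    # Predict a large DataFrame in slices of `chunk_size` rows and
    # concatenate the resulting label arrays.
    parts = []
    for start in range(0, len(X), chunk_size):
        parts.append(model.predict(X.iloc[start:start + chunk_size]))
    return np.concatenate(parts)

predictions1 = predict_in_chunks(clf_svm1, x1_test)
```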