Closed nosacapital closed 1 year ago
Hi @nosacapital,
Thank you for reporting the issue. Please try increasing the total_time_limit value; the default is 3600 seconds, which might not be enough for a large dataset.
@pplonski
Thank you for your response, and just letting you know that what you suggested worked! And yes, thank you for a wonderful library; it is most appreciated.
I am currently trying out mljar on Numerai data. The data is quite large, and running a training operation causes the kernel to die. For reference, the kernel I am using is Python 3.9.13. One message I got at the start of every training operation was:

Numerical issues were encountered when centering the data and might not be solved. Dataset may contain too large values. You may need to prescale your features.
Numerical issues were encountered when scaling the data and might not be solved. The standard deviation of the data is probably very close to 0.
I then scaled the data and ran PCA to reduce the number of features. I still got the message above, but the training ran to completion. I have been using shorter model run times to make sure I am on the right path before committing to longer, more comprehensive runs. After the training finished, I tried to generate predictions, and running the code raised the AutoML Exception error in the title of this message. Here is the code:
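(This is not the original code, which was omitted above. It is a hedged sketch of the scaling-plus-PCA preprocessing described, using scikit-learn; the data shapes and values are made up to mimic the "too large values" warning.)

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for a wide Numerai-style feature matrix with
# large raw magnitudes, the situation the warning complains about.
rng = np.random.default_rng(0)
X = rng.normal(loc=1e6, scale=5.0, size=(1000, 50))
y = rng.integers(0, 2, size=1000)

# Standardize first so no large-magnitude column dominates the PCA.
X_scaled = StandardScaler().fit_transform(X)

# Keep enough principal components to explain ~95% of the variance;
# passing a float in (0, 1) to n_components selects components by
# cumulative explained variance.
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X_scaled)
```

The reduced matrix X_reduced (and y) would then be passed to AutoML.fit; note that the same fitted scaler and PCA must be applied to any data given to predict, otherwise the model sees features on a different scale than it was trained on.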
Please assist in this matter. Thanks.