kaz-Anova / StackNet

StackNet is a computational, scalable and analytical Meta modelling framework
MIT License

can't figure this error out? #72

Closed AdityaSoni19031997 closed 5 years ago

AdityaSoni19031997 commented 6 years ago

@kaz-Anova @jq (sorry for the @)

 prediction to has failed due to 2
printing prediction to  preds.csv has failed due to null
 predicting on test data lasted : 323.916000
Exception in thread "main" java.lang.reflect.InvocationTargetException
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.base/java.lang.reflect.Method.invoke(Unknown Source)
        at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
Caused by: java.lang.NullPointerException
        at stacknetrun.runstacknet.main(runstacknet.java:745)
        ... 5 more

Command run in cmd (Anaconda's terminal in a virtual env):

java -jar StackNet.jar train task=classification train_file=train.csv test_file=test.csv params=params.txt pred_file=preds0.csv test_target=false has_head=true verbose=true Threads=4 folds=3 metric=auc --output_name=model_params --indices_name=k_folds --seed=10 --include_target=True

Here's my params.txt:

XgboostClassifier booster:gbtree num_round:1500 eta:0.03 max_leaves:0 gamma:.6 max_depth:8 min_child_weight:1.0 subsample:0.9 colsample_bytree:0.75 colsample_bylevel:0.85 lambda:1.0 alpha:1.0 seed:1 threads:4 bags:1 verbose:true
SklearnknnClassifier seed:1 usedense:true use_scale:false distance:cityblock metric:uniform n_neighbors:10 thread:4 verbose:true
KerasnnClassifier loss:categorical_crossentropy standardize:true use_log1p:true shuffle:true batch_normalization:true weight_init:lecun_uniform momentum:0.9 optimizer:sgd use_dense:true l2:0.1,0.1 hidden:30,20 activation:relu,relu droupouts:0.2,0.1 epochs:250 lr:0.01 batch_size:32 stopping_rounds:10 validation_split:0.2 seed:1 verbose:true
SklearnknnClassifier seed:1 usedense:true use_scale:false distance:cosine metric:uniform n_neighbors:10 thread:4 verbose:true
KerasnnClassifier loss:categorical_crossentropy standardize:true use_log1p:false shuffle:true batch_normalization:true weight_init:lecun_uniform momentum:0.9 optimizer:sgd use_dense:true l2:0.001,0.001 hidden:30,15 activation:relu,relu droupouts:0.3,0.1 epochs:250 lr:0.01 batch_size:32 stopping_rounds:10 validation_split:0.2 seed:1 verbose:true

XgboostClassifier booster:gbtree num_round:1000 eta:0.03 max_leaves:0 gamma:.6 max_depth:8 min_child_weight:1.0 subsample:0.9 colsample_bytree:0.75 colsample_bylevel:0.85 lambda:1.0 alpha:1.0 seed:1 threads:4 bags:1 verbose:true
KerasnnClassifier loss:categorical_crossentropy standardize:true use_log1p:false shuffle:true batch_normalization:true weight_init:lecun_uniform momentum:0.9 optimizer:sgd use_dense:true l2:0.001,0.001 hidden:50,25 activation:relu,relu droupouts:0.25,0.15 epochs:250 lr:0.01 batch_size:32 stopping_rounds:10 validation_split:0.2 seed:1 verbose:true

I have verified that I am on the latest version of the repo, and that everything else is installed properly.

I am on Win 10 x64 with 8 GB of RAM and 4 logical cores.

The weird part is that on removing the XGBoost model it works perfectly (I have XGBoost at the end of the 2nd level).
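Since removing one model changes the outcome, a practical next step is to isolate the failing learner by running StackNet on one model at a time. A minimal sketch of that idea is below; the helper name `split_params` and the file names are my own for illustration, and note that splitting per line discards the multi-level stacking structure of the real params file, which is fine for a single-model smoke test:

```python
# Debugging sketch (not StackNet code): run each model line of params.txt
# in isolation to find which learner triggers the crash.

def split_params(text):
    """Return the non-empty model lines of a StackNet params file."""
    return [line for line in text.splitlines() if line.strip()]

# Inline stand-in for params.txt with two level-1 models (hypothetical values).
params = """XgboostClassifier num_round:100 verbose:true
SklearnknnClassifier n_neighbors:10 verbose:true
"""

for i, model in enumerate(split_params(params)):
    # Each model line would be written to its own file, e.g. params_only_0.txt,
    # and StackNet run against it with the same data, for example:
    #   java -jar StackNet.jar train task=classification train_file=train.csv \
    #        test_file=test.csv params=params_only_0.txt pred_file=preds_0.csv
    print(f"model {i}: {model.split()[0]}")
```

Whichever single-model run reproduces the NullPointerException points at the learner (or its bindings) to investigate.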

gdragone1 commented 5 years ago

Same problem. Could you fix it?

gdragone1 commented 5 years ago

@goldentom42 Sorry for the @. I want to know whether you can fix it. Thank you!

goldentom42 commented 5 years ago

I don't think I really can. Are you on Win 10 as well? I usually run things on Ubuntu :(

gdragone1 commented 5 years ago

I ran it on Ubuntu too, but the same error happened.

gdragone1 commented 5 years ago

Maybe I know the reason. I kept failing to generate the sparse files, so I deleted part of the data from the test file, but I forgot to delete the pred_file. That is, I didn't keep the data consistent between the test and pred files. I will try again.
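If a stale prediction file was indeed the cause, a quick sanity check before rerunning is to verify that the test file and any pre-existing prediction file agree on row count. A minimal sketch, using inline stand-ins for test.csv and preds.csv (the helper name `count_rows` and the sample data are my own, not part of StackNet):

```python
# Sanity-check sketch: compare data-row counts between a test CSV and an
# existing prediction CSV before invoking StackNet again.
import csv
import io

def count_rows(text, has_header=True):
    """Count data rows in CSV text, optionally skipping the header line."""
    rows = list(csv.reader(io.StringIO(text)))
    return len(rows) - 1 if has_header else len(rows)

# Tiny inline stand-ins for test.csv and preds.csv (hypothetical data).
test_csv = "f1,f2\n1,2\n3,4\n5,6\n"
pred_csv = "pred\n0.1\n0.9\n0.3\n"

assert count_rows(test_csv) == count_rows(pred_csv), (
    "Row mismatch: delete or regenerate the prediction file before rerunning"
)
print("row counts match:", count_rows(test_csv))
```

In practice you would read the real files from disk; if the counts differ, deleting or regenerating the stale prediction file restores consistency.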

goldentom42 commented 5 years ago

@gdragone1 Thanks. I'll have trouble reproducing this since my processing power is dedicated to the Malware competition ;)