topepo / caret

caret (Classification And Regression Training) is an R package that contains miscellaneous functions for training and plotting classification and regression models.
http://topepo.github.io/caret/index.html

Inappropriate network architecture of one neuron per layer in dnn #459

Closed · dashaub closed this issue 7 years ago

dashaub commented 7 years ago

The dnn models from the "deepnet" package will not produce sensible models when any layer has a size of 1. This results in many wasted training rounds spent on models that cannot produce predictions. Here is a reproducible example:

# Initial configuration: load packages and prepare the spam dataset
library(caret)
library(kernlab)
data(spam)
seed <- 3434
set.seed(seed)
trInd <- createDataPartition(spam$type, p = 0.8, list = FALSE)
dat <- spam[trInd, ]
testing <- spam[-trInd, ]
nrounds <- 300
ctrl <- trainControl(method = "repeatedcv",
                     number = 5,
                     repeats = 3,
                     search = "random", verboseIter = TRUE)

# Fit the 300 models. This takes a few hours on my computer.
# If reproducibility isn't a concern, register a parallel cluster or reduce nrounds/repeats
set.seed(53)
dnnMod <- train(type ~ ., data = dat,
                method = "dnn",
                tuneLength = nrounds,
                trControl = ctrl,
                preProcess = c("center", "scale"),
                learningrate_scale = 0.99,
                numepochs = 10
)

This produces NaN accuracy metrics whenever a layer of size 1 is followed by another nonzero layer (every NaN row below has such a layer):

> dnnMod
Stacked AutoEncoder Deep Neural Network

3682 samples
  57 predictors
   2 classes: 'nonspam', 'spam'

Pre-processing: centered (57), scaled (57)
Resampling: Cross-Validated (5 fold, repeated 3 times)
Summary of sample sizes: 2946, 2946, 2944, 2946, 2946, 2946, ...
Resampling results across tuning parameters:

  layer1  layer2  layer3  hidden_dropout  visible_dropout  Accuracy
   1       0       5      0.0303532045    0.0555206668           NaN
   1       3      10      0.0960047701    0.0447998974           NaN
   1       7       6      0.0807323362    0.0539364793           NaN
   1      11      16      0.0195732087    0.0054741996           NaN
   1      11      18      0.0595457288    0.0427085490           NaN
   1      12       0      0.0460228673    0.0138835733           NaN
   1      12       7      0.0964683202    0.0398542636           NaN
   1      13      14      0.0418495717    0.0856799686           NaN
   1      13      20      0.0711885021    0.0772274121           NaN
   1      14      15      0.0972542390    0.0920863069           NaN
   1      16      17      0.0847285447    0.0231470994           NaN
   1      17       5      0.0746642481    0.0777320329           NaN
   2       3       3      0.0686997386    0.0630431575     0.7434886
   2       7      18      0.0173265673    0.0563232687     0.6442106
   2       8      17      0.0133598291    0.0701903094     0.6260295
   2      12       9      0.0061151537    0.0367232730     0.8813886
   2      13      12      0.0266208650    0.0719709794     0.8677364
   2      13      20      0.0200229623    0.0995198656     0.6681134
   2      14       2      0.0040629404    0.0907437931     0.6334570
   2      14       6      0.0623943682    0.0441147273     0.7721706
   2      14      20      0.0242287333    0.0066426198     0.7440274
   2      14      20      0.0536645514    0.0391730980     0.6565297
   2      15      11      0.0980921378    0.0628546715     0.7243084
   2      16       1      0.0605730169    0.0728218432     0.6648882
   2      16      16      0.0826333246    0.0938874916     0.6798338
   2      17       9      0.0270237025    0.0903392394     0.7288098
   2      20      16      0.0596033858    0.0896549990     0.6059208
   3       1      16      0.0480999398    0.0798874773           NaN
   3       3      17      0.0344363164    0.0270008700     0.7264911
   3       3      20      0.0547936246    0.0870254179     0.6843135
   3       8      17      0.0628187845    0.0135157251     0.7455947
   3       9      13      0.0898620321    0.0581724677     0.8046254
   3      12       1      0.0638796735    0.0779516777     0.7919946
   3      15       5      0.0285995792    0.0924419798     0.7010295
   3      15       9      0.0730303400    0.0104426297     0.7457912
   3      15      10      0.0778545300    0.0921427914     0.6437831
   3      15      18      0.0943426471    0.0810549916     0.6704003
   3      16       0      0.0799159766    0.0591416152     0.8334341
   3      16       9      0.0256862130    0.0613935983     0.6255766
   3      17      12      0.0312051173    0.0737770012     0.7054532
   3      17      13      0.0657062478    0.0692675602     0.6465911
   3      18       4      0.0879884311    0.0045707638     0.7053188
   4       0      17      0.0340292917    0.0341190463     0.6910683
   4       1       0      0.0599374291    0.0854206727     0.9157098
   4       1      15      0.0407069403    0.0634070453           NaN
   4       3      18      0.0973241243    0.0162406221     0.6771550
   4       7       0      0.0252670887    0.0910375625     0.9239490
   4       7       3      0.0183407091    0.0522277151     0.8565036
   4       9      17      0.0742405773    0.0534069530     0.6617845
   4       9      20      0.0733135570    0.0532177786     0.6595440
   4      12       2      0.0763417148    0.0008380174     0.8048695
   4      14       0      0.0929064913    0.0562838447     0.8950616
   4      14       2      0.0827641716    0.0811955024     0.6567458
   4      16       7      0.0790674150    0.0465439239     0.6805963
   4      17      10      0.0396531230    0.0268010394     0.6626237
   4      18      14      0.0108793038    0.0061934378     0.6059208
   4      19       2      0.0003019263    0.0069020029     0.6059208
   4      19      18      0.0725245283    0.0544870582     0.6414280
   4      20      15      0.0254865304    0.0608403908     0.6059208
   5       1      13      0.0597089679    0.0693810188           NaN
   5       3      16      0.0800283108    0.0385884422     0.6965408
   5       5       4      0.0684478910    0.0985489759     0.7752875
   5       6       8      0.0001567535    0.0167806323     0.8442148
   5       8      17      0.0264661069    0.0173892777     0.6256404
   5      10      14      0.0300449792    0.0890354940     0.6272397
   5      12      14      0.0254141537    0.0385190712     0.6656547
   5      19       6      0.0161334573    0.0324477997     0.6118085
   5      20      16      0.0175572739    0.0082649597     0.6440549
   6       3       7      0.0697702109    0.0131508637     0.8059856
   6       3      14      0.0123291219    0.0628619234     0.6579135
   6       3      20      0.0515973381    0.0058781257     0.6854241
   6       5       6      0.0977208965    0.0526541232     0.7400097
   6       5       9      0.0414443023    0.0538027052     0.8115772
   6       5       9      0.0990682493    0.0671198774     0.7853782
   6       7      13      0.0092083705    0.0058026344     0.6446615
   6       8      12      0.0254348582    0.0552359091     0.7070824
   6       8      15      0.0746142820    0.0817301623     0.7189365
   6      10       9      0.0933980624    0.0506252453     0.7922557
   6      10      18      0.0737401119    0.0613679722     0.6494896
   6      12       2      0.0620109796    0.0003813255     0.7294339
   6      16       0      0.0309213851    0.0309030363     0.7870078
   6      17       2      0.0400327184    0.0370308052     0.6059208
   6      18      11      0.0982213833    0.0483987762     0.6059208
   6      19       6      0.0327138680    0.0434681124     0.6059208
   6      19      18      0.0113939154    0.0582117148     0.6059208
   6      20       0      0.0207339396    0.0718490553     0.6611160
   7       0      13      0.0608022664    0.0431672892     0.9164355
   7       1       6      0.0779356175    0.0359339394           NaN
   7       1       8      0.0418610705    0.0411907115           NaN
   7       2       2      0.0448866416    0.0032449579     0.6693914
   7       2       9      0.0695161640    0.0536240368     0.7601439
   7       6      18      0.0880208917    0.0973949573     0.6341817
   7       8      19      0.0741921572    0.0964305997     0.6500331
   7       9      18      0.0380355645    0.0952321177     0.6236503
   7      10      12      0.0840414832    0.0998469410     0.7590497
   7      10      19      0.0053045405    0.0787711311     0.6281430
   7      16      16      0.0310382724    0.0221608245     0.6087288
   7      18      11      0.0021169064    0.0902610013     0.6059208
   7      19      15      0.0518740154    0.0512119500     0.6252143
   8       4      13      0.0924646899    0.0206246812     0.7280339
   8       6      19      0.0684296646    0.0161689887     0.6546527
   8       7      17      0.0205260882    0.0911833879     0.7003809
   8       9      17      0.0942516443    0.0429712285     0.6414280
   8      11       8      0.0047160539    0.0655811106     0.7135939
   8      11      18      0.0197309939    0.0496291904     0.6263012
   8      12      20      0.0873736237    0.0459064620     0.6673795
   8      13       9      0.0997396317    0.0437973853     0.7357171
   8      14      20      0.0784777524    0.0492312790     0.6444878
   8      16      16      0.0770195977    0.0547046280     0.6421937
   8      18      15      0.0140701076    0.0993019235     0.6241273
   8      18      17      0.0319769129    0.0714006229     0.6059208
   9       0       0      0.0721906190    0.0579634242     0.9201481
   9       0      15      0.0844384923    0.0021630321     0.8110475
   9       3      13      0.0576010384    0.0052721751     0.7721394
   9       4       4      0.0261000815    0.0006999862     0.8168911
   9       5      20      0.0148706099    0.0288077496     0.6059208
   9       6      10      0.0264407041    0.0914993551     0.8760466
   9       7       3      0.0535369525    0.0534398045     0.8546808
   9      11       5      0.0660226804    0.0099512940     0.8580376
   9      11       9      0.0371662059    0.0280699856     0.8199135
   9      14      17      0.0136976121    0.0378362084     0.6799746
   9      19       8      0.0105444717    0.0263507846     0.6059208
   9      19      14      0.0880596952    0.0853915355     0.6442360
  10       1      11      0.0049721607    0.0285792225           NaN
  10       1      11      0.0191210685    0.0544996783           NaN
  10       3      10      0.0116520541    0.0459254144     0.8666063
  10       4      11      0.0565525898    0.0352019826     0.8540115
  10       5      13      0.0664916910    0.0826930871     0.6943661
  10       6      20      0.0418361512    0.0195010644     0.6455947
  10       7      13      0.0483821644    0.0756593243     0.7901209
  10       8       0      0.0526392685    0.0742307182     0.9144421
  10      10      17      0.0784661271    0.0601981114     0.6219316
  10      13       6      0.0733139041    0.0554361527     0.8493579
  10      13      12      0.0920993958    0.0909604860     0.6968381
  10      14       7      0.0111527961    0.0006241793     0.6472221
  10      20       6      0.0460304679    0.0439916489     0.6407940
  10      20       7      0.0276610511    0.0235314498     0.6256672
  11       0       9      0.0746418726    0.0597347851     0.9137200
  11       1      16      0.0845148505    0.0938224666           NaN
  11       2       3      0.0052845639    0.0940797451     0.8031009
  11       3      17      0.0312363966    0.0558310705     0.6343628
  11       4      12      0.0200169914    0.0049058395     0.6789177
  11       4      12      0.0755565116    0.0273495020     0.7963688
  11       6       0      0.0675391498    0.0228400049     0.9128137
  11       6       8      0.0088893389    0.0487192508     0.9032152
  11       7       2      0.0340321284    0.0352193271     0.7615763
  11       7       9      0.0069759586    0.0590788451     0.8449922
  11       8       5      0.0537643366    0.0637622646     0.8526724
  11      13       6      0.0204741132    0.0459670924     0.7844357
  11      14      19      0.0604449473    0.0032672165     0.6436925
  11      16       6      0.0454816580    0.0360916703     0.6355627
  11      16      10      0.0098163042    0.0115870824     0.6059208
  11      16      13      0.0015527493    0.0464762107     0.6059208
  12       0      11      0.0744550871    0.0872632808     0.9149894
  12       1      13      0.0556459437    0.0383128788           NaN
  12       1      20      0.0774373681    0.0767671066           NaN
  12       2       6      0.0355450430    0.0584669822     0.7825183
  12       3      13      0.0621167971    0.0516891461     0.7572985
  12       4       3      0.0585194130    0.0115501592     0.7192539
  12       4      13      0.0342631596    0.0356894415     0.7709545
  12       8       5      0.0046644636    0.0298937593     0.8480462
  12      11       5      0.0756802929    0.0858984735     0.8155267
  12      13      11      0.0557452575    0.0204424400     0.7447118
  12      14       7      0.0721624149    0.0709915829     0.7374634
  12      14      15      0.0562589691    0.0258596478     0.7368357
  12      14      16      0.0513946798    0.0174856102     0.6610575
  12      14      18      0.0027618685    0.0583758085     0.6246200
  12      17       6      0.0176217205    0.0519520806     0.6578622
  12      19      12      0.0220767074    0.0132015073     0.6059208
  12      19      19      0.0280886921    0.0714095731     0.6059208
  13       0       7      0.0516209833    0.0828226525     0.9171621
  13       1       1      0.0929063394    0.0271899439           NaN
  13       2      19      0.0104408093    0.0735992586     0.6059208
  13       3       6      0.0199490499    0.0450293374     0.8572016
  13       3      12      0.0559329693    0.0173069874     0.8065564
  13       3      20      0.0200646421    0.0800219824     0.6445722
  13       5       8      0.0420392147    0.0341077526     0.8892784
  13       7       1      0.0889114213    0.0549556896     0.6762529
  13      12       4      0.0450805360    0.0530496172     0.8409008
  13      12      17      0.0406522157    0.0394117523     0.6278411
  13      15      13      0.0670106694    0.0685165448     0.6248520
  13      15      13      0.0815757827    0.0372224281     0.6141635
  13      15      19      0.0635948794    0.0754252026     0.6275107
  13      16      19      0.0041493081    0.0738671472     0.6059208
  13      17      12      0.0572167142    0.0341665950     0.6059208
  13      19      14      0.0521202399    0.0682723747     0.6246454
  13      20      17      0.0417225970    0.0237854125     0.6059208
  14       3       4      0.0179660699    0.0158905599     0.8126573
  14       3       5      0.0983824676    0.0706315536     0.6581641
  14       4       6      0.0430562126    0.0414827880     0.8528807
  14       4       9      0.0709175287    0.0161102774     0.7936401
  14       5       1      0.0929241179    0.0460523076     0.6490367
  14       8      10      0.0566217174    0.0340054306     0.8389174
  14      10       7      0.0546674240    0.0409806787     0.8622010
  14      11      16      0.0529987719    0.0385201949     0.6713403
  14      13      16      0.0299533488    0.0080484367     0.6625729
  14      14       2      0.0806897891    0.0769345532     0.7132245
  14      14      13      0.0211947500    0.0612084589     0.6907053
  14      14      17      0.0427373804    0.0986259251     0.6730809
  14      15      11      0.0137486665    0.0049725479     0.6059208
  14      17       2      0.0487946264    0.0748302079     0.6059208
  14      17      17      0.0909255469    0.0111221906     0.6059208
  14      19      11      0.0098669993    0.0528258273     0.6339617
  15       1      18      0.0364563622    0.0840852625           NaN
  15       3       6      0.0984845618    0.0531792429     0.7161007
  15       4      19      0.0737111056    0.0861002411     0.6769741
  15       5      17      0.0242374249    0.0863713467     0.6059208
  15       6      14      0.0699593368    0.0423911722     0.8166321
  15      11       3      0.0366491644    0.0490186202     0.8601273
  15      15       2      0.0828619710    0.0910614491     0.6768447
  15      18      17      0.0026940911    0.0379645858     0.6059208
  15      20      12      0.0473564541    0.0937460207     0.6326418
  16       0      10      0.0173741976    0.0321686052     0.8996096
  16       1       0      0.0705010520    0.0521992320     0.8683519
  16       1      13      0.0065911863    0.0442507861           NaN
  16       2       4      0.0170522182    0.0478100315     0.8463072
  16       2      11      0.0254064502    0.0136916826     0.8671081
  16       2      12      0.0929704381    0.0645426366     0.6564643
  16       2      14      0.0091229738    0.0937373532     0.6282834
  16       4      19      0.0008452923    0.0141929839     0.6059208
  16       6       2      0.0178411888    0.0340891263     0.8264638
  16       7      11      0.0495682528    0.0237018071     0.8318027
  16       7      20      0.0769539345    0.0207789044     0.6493758
  16       9      12      0.0331446289    0.0019707318     0.6583664
  16      10       8      0.0081516500    0.0232491577     0.7994519
  16      10      13      0.0887463718    0.0256169741     0.7658850
  16      13       6      0.0726421454    0.0660597328     0.7780618
  16      15       0      0.0150044309    0.0738163555     0.7562247
  16      15      12      0.0112615023    0.0102724504     0.6059208
  16      16      10      0.0034168094    0.0473798062     0.6059208
  16      17      14      0.0054974459    0.0664625807     0.6093582
  16      18      13      0.0188289405    0.0887616599     0.6429679
  17       1      14      0.0299919159    0.0719246997           NaN
  17       3       2      0.0716673137    0.0332756045     0.7380370
  17       3      16      0.0522062810    0.0484245665     0.6615939
  17       5       6      0.0661407853    0.0295777171     0.8549477
  17       5      14      0.0003257760    0.0874005463     0.6059208
  17       9       3      0.0820388033    0.0817159669     0.8183122
  17       9      10      0.0918459306    0.0122017376     0.8522535
  17       9      18      0.0324852671    0.0312233052     0.6588621
  17      10      13      0.0740833557    0.0175186961     0.7798035
  17      11       1      0.0422744220    0.0786485722     0.7770825
  17      11      16      0.0589684833    0.0842745525     0.6763414
  17      11      20      0.0008738054    0.0428028703     0.6059208
  17      12      11      0.0478993203    0.0917421855     0.7755570
  17      12      16      0.0094793868    0.0702229943     0.6375331
  17      13       3      0.0869884331    0.0530811133     0.8270876
  17      14       8      0.0388763855    0.0168894768     0.7852133
  17      15       3      0.0946782758    0.0627616733     0.7097924
  17      15       4      0.0789401192    0.0930735366     0.6991495
  17      15      15      0.0743014272    0.0425923801     0.7238424
  17      18      18      0.0617271199    0.0110055080     0.6444172
  17      19       7      0.0008763322    0.0288322742     0.6059208
  17      19      11      0.0877533202    0.0212608549     0.6310909
  18       1       7      0.0415568856    0.0255481140           NaN
  18       1      12      0.0088156353    0.0006957839           NaN
  18       1      17      0.0374594161    0.0977085875           NaN
  18       4       2      0.0471210893    0.0038087334     0.8185715
  18       5       1      0.0551754689    0.0369461109     0.7443640
  18       5       8      0.0042202065    0.0445607063     0.8561401
  18       8      13      0.0691126265    0.0551276063     0.7564067
  18       9       1      0.0186632466    0.0085342061     0.7613119
  18      14       8      0.0511331375    0.0331108151     0.7755047
  18      18       4      0.0716535661    0.0282682250     0.6928054
  18      20       3      0.0141740455    0.0932986124     0.6059208
  19       0       2      0.0885856061    0.0915150470     0.8595785
  19       2      19      0.0585384400    0.0390051762     0.6388689
  19       2      19      0.0847604913    0.0173078812     0.6388616
  19       3       5      0.0699046707    0.0332542929     0.8173287
  19       3       9      0.0340431205    0.0854914270     0.8423772
  19       5      18      0.0208339741    0.0126523562     0.6235838
  19       6      17      0.0496309707    0.0919940796     0.6756453
  19       8      14      0.0811413218    0.0879446328     0.7628332
  19       9      15      0.0692716730    0.0273537979     0.8370292
  19      10      12      0.0197504764    0.0473038070     0.6794131
  19      11       0      0.0399920690    0.0391475215     0.8577722
  19      11       9      0.0330635822    0.0614877459     0.8388652
  19      11      15      0.0605796908    0.0429613550     0.8104945
  19      13      18      0.0673047388    0.0593715972     0.7108410
  19      14      19      0.0224251035    0.0464998627     0.6215911
  19      15      10      0.0707951224    0.0055281003     0.7362860
  19      15      15      0.0623914229    0.0045650311     0.7691322
  20       0       3      0.0408897889    0.0378048397     0.8430188
  20       1       6      0.0758628458    0.0705682486           NaN
  20       5       1      0.0638693323    0.0623755265     0.7737685
  20       5      16      0.0298601225    0.0449336085     0.7132600
  20       6       8      0.0571531197    0.0867720321     0.8396825
  20       9       1      0.0426940512    0.0465432032     0.8013455
  20       9       9      0.0761736724    0.0171565686     0.8305372
  20      11       7      0.0151890761    0.0080497191     0.8332448
  20      11      11      0.0067563578    0.0188904993     0.7282264
  20      12       6      0.0966161279    0.0746993553     0.8378492
  20      13       4      0.0491811626    0.0829958878     0.8347796
  20      13      17      0.0544408576    0.0287742489     0.7446134
  20      14       1      0.0247825198    0.0534558174     0.6976292
  20      14      15      0.0147640387    0.0469349012     0.6763484
  20      16       0      0.0112414388    0.0718858912     0.6490367
  20      16      18      0.0151465736    0.0304482571     0.6059208
  20      19       5      0.0054206408    0.0859775582     0.6059208
  20      20       5      0.0647960211    0.0870728425     0.6454135
  Kappa
          NaN
          NaN
          NaN
          NaN
          NaN
          NaN
          NaN
          NaN
          NaN
          NaN
          NaN
          NaN
  0.368247158
  0.102836415
  0.053845681
  0.736802072
  0.697809417
  0.171863701
  0.076904147
  0.449116486
  0.369041478
  0.137961572
  0.322352365
  0.157862103
  0.199302861
  0.326934904
  0.000000000
          NaN
  0.321389671
  0.209810797
  0.373786668
  0.530115530
  0.500128977
  0.255977834
  0.375989799
  0.102216453
  0.176317467
  0.603264810
  0.052861584
  0.265553205
  0.111965856
  0.274865324
  0.224941182
  0.821722601
          NaN
  0.192147011
  0.840474066
  0.668553179
  0.150633851
  0.145472197
  0.530562011
  0.767398211
  0.138198702
  0.203927482
  0.155003742
  0.000000000
  0.000000000
  0.096128925
  0.000000000
          NaN
  0.244707632
  0.455951471
  0.635430140
  0.052588182
  0.056208696
  0.164506262
  0.017766045
  0.102713270
  0.535990477
  0.141761922
  0.214956300
  0.364962804
  0.552770994
  0.487589335
  0.104264373
  0.276793007
  0.306040181
  0.506655261
  0.119822658
  0.333851568
  0.483285679
  0.000000000
  0.000000000
  0.000000000
  0.000000000
  0.148014903
  0.823787940
          NaN
          NaN
  0.170626192
  0.417404124
  0.078001733
  0.121062724
  0.048004734
  0.420573454
  0.058161423
  0.008858725
  0.000000000
  0.051690951
  0.333763503
  0.134308441
  0.254084888
  0.096797218
  0.290306486
  0.054117943
  0.172833313
  0.354309736
  0.107514643
  0.098249157
  0.049442279
  0.000000000
  0.831878245
  0.551543252
  0.451777269
  0.565801290
  0.000000000
  0.725721126
  0.671676801
  0.682800415
  0.578239438
  0.209393469
  0.000000000
  0.102821318
          NaN
          NaN
  0.702791292
  0.666957015
  0.240000487
  0.107819965
  0.497163372
  0.819320564
  0.044090468
  0.662150310
  0.253261637
  0.114464055
  0.095560652
  0.052826010
  0.818110145
          NaN
  0.539204477
  0.078260482
  0.202261710
  0.523513136
  0.815901494
  0.793932562
  0.421765029
  0.643135256
  0.663091116
  0.481382986
  0.104285717
  0.080974209
  0.000000000
  0.000000000
  0.820912040
          NaN
          NaN
  0.481943147
  0.410468516
  0.312317943
  0.454389489
  0.648775797
  0.570716074
  0.379798115
  0.361878270
  0.366423743
  0.149568316
  0.050350191
  0.141071319
  0.000000000
  0.000000000
  0.825478213
          NaN
  0.000000000
  0.678723672
  0.547403314
  0.103763109
  0.762761285
  0.200518188
  0.630360427
  0.062319927
  0.054702305
  0.024122325
  0.060905374
  0.000000000
  0.000000000
  0.050556550
  0.000000000
  0.564115034
  0.143548718
  0.672554048
  0.514488725
  0.120445149
  0.631518935
  0.694263079
  0.186106502
  0.156876228
  0.297658655
  0.236094590
  0.184092715
  0.000000000
  0.000000000
  0.000000000
  0.078918940
          NaN
  0.304238527
  0.194643526
  0.000000000
  0.575865030
  0.691600073
  0.197008558
  0.000000000
  0.074838219
  0.785439060
  0.709870439
          NaN
  0.650643307
  0.719955376
  0.143172055
  0.063493182
  0.000000000
  0.596601918
  0.623822013
  0.120564832
  0.143093294
  0.529938927
  0.444179485
  0.472291830
  0.403807004
  0.000000000
  0.000000000
  0.010334123
  0.100436045
          NaN
  0.367342328
  0.158608505
  0.683206388
  0.000000000
  0.589046040
  0.683369064
  0.149084468
  0.485104241
  0.475340647
  0.196730961
  0.000000000
  0.478546419
  0.087221664
  0.613899080
  0.498794944
  0.287741865
  0.260121260
  0.337283543
  0.111955866
  0.000000000
  0.070997923
          NaN
          NaN
          NaN
  0.590264185
  0.392489555
  0.686526148
  0.418779679
  0.426935998
  0.476953969
  0.243745076
  0.000000000
  0.698029085
  0.092977772
  0.098759816
  0.591220111
  0.655057246
  0.048212957
  0.198505155
  0.441553041
  0.648400319
  0.208196849
  0.694968376
  0.646051457
  0.570760541
  0.295679229
  0.043934784
  0.363342781
  0.461662877
  0.661435211
          NaN
  0.470492356
  0.302754830
  0.660802400
  0.555048807
  0.628759379
  0.633587407
  0.349186853
  0.648639256
  0.638561563
  0.394116643
  0.258082467
  0.200215061
  0.124323936
  0.000000000
  0.000000000
  0.113034241
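
The degenerate behavior can also be confirmed directly in deepnet, without the full caret run. Here is a minimal sketch that reuses the dat split from above and refits one of the NaN architectures from the table (1, 3, 10):

library(deepnet)
# Direct check: a single-neuron layer followed by larger layers yields a
# degenerate model
x <- as.matrix(dat[names(dat) != "type"])
y <- ifelse(dat$type == "spam", 1, 0)
set.seed(1)
fit <- sae.dnn.train(x, y, hidden = c(1, 3, 10), numepochs = 10)
# Expect degenerate predictions (constant or NaN), consistent with the
# NaN resampling metrics above
summary(nn.predict(fit, x))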

Obligatory sessionInfo()

> sessionInfo()
R version 3.2.2 (2015-08-14)
Platform: x86_64-pc-linux-gnu (64-bit)
Running under: Red Hat Enterprise Linux Server release 6.7 (Santiago)

locale:
 [1] LC_CTYPE=en_US.UTF-8       LC_NUMERIC=C
 [3] LC_TIME=en_US.UTF-8        LC_COLLATE=en_US.UTF-8
 [5] LC_MONETARY=en_US.UTF-8    LC_MESSAGES=en_US.UTF-8
 [7] LC_PAPER=en_US.UTF-8       LC_NAME=C
 [9] LC_ADDRESS=C               LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base

other attached packages:
[1] deepnet_0.2      kernlab_0.9-22   caret_6.0-70     ggplot2_2.0.0
[5] lattice_0.20-33  data.table_1.9.6

loaded via a namespace (and not attached):
 [1] Rcpp_0.12.2        magrittr_1.5       splines_3.2.2      MASS_7.3-43
 [5] munsell_0.4.2      colorspace_1.2-6   foreach_1.4.3      minqa_1.2.4
 [9] stringr_1.0.0      car_2.1-1          plyr_1.8.3         tools_3.2.2
[13] parallel_3.2.2     nnet_7.3-10        pbkrtest_0.4-4     grid_3.2.2
[17] gtable_0.1.2       nlme_3.1-121       mgcv_1.8-7         quantreg_5.19
[21] e1071_1.6-7        class_7.3-13       MatrixModels_0.4-1 iterators_1.0.8
[25] lme4_1.1-10        Matrix_1.2-2       nloptr_1.0.4       reshape2_1.4.1
[29] codetools_0.2-14   stringi_1.0-1      compiler_3.2.2     scales_0.3.0
[33] stats4_3.2.2       SparseM_1.7        chron_2.3-47

This can be fixed by restricting the allowable number of neurons in each layer to >= 2, as in PR #458. While a layer size of 0 is acceptable so long as at least one other layer is >= 2, avoiding the degenerate combinations entirely would require a more complicated sampling implementation, so I'm not sure supporting networks shallower than three hidden layers is worthwhile here. The point of this package is deep learning, after all, and nnet can be used for single-hidden-layer models.
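
For illustration, a minimal sketch of a random-search grid with all layer sizes restricted to 2:20, written against caret's grid(x, y, len, search) convention for custom model code. The dnnGrid name is hypothetical and this is not the actual PR #458 change; the dropout ranges simply mirror the values seen in the results above:

# Hypothetical sketch: random grid with every layer size drawn from 2:20
dnnGrid <- function(x, y, len = NULL, search = "random") {
  data.frame(layer1 = sample(2:20, len, replace = TRUE),
             layer2 = sample(2:20, len, replace = TRUE),
             layer3 = sample(2:20, len, replace = TRUE),
             hidden_dropout = runif(len, max = 0.1),
             visible_dropout = runif(len, max = 0.1))
}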

Alternatively, sampling could be done in c(0, 2:20). The probability of all three layers being zero is very low, so this would allow exploration of shallower architectures while drastically reducing the number of models fit with an inappropriate number of neurons (a quick calculation is below). Thoughts on these two options?
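
To put a number on "very low": with each layer drawn uniformly and independently from the 20 candidate values in c(0, 2:20), the chance of an all-zero architecture is (1/20)^3:

> sizes <- c(0, 2:20)
> (1 / length(sizes))^3
[1] 0.000125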

As a final point, and actually a separate issue, we could expand the network to a depth of four layers. When combined with sampling from c(0, 2:20), inappropriate network architectures should almost never be sampled; a quick sketch of that scheme follows.
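
A rough sketch of the combined scheme; the layer4 parameter is hypothetical, since the current dnn model code only exposes layer1 through layer3, and the rare all-zero draws can simply be discarded:

# Hypothetical depth-4 random grid over c(0, 2:20)
sizes <- c(0, 2:20)
grid4 <- data.frame(layer1 = sample(sizes, 300, replace = TRUE),
                    layer2 = sample(sizes, 300, replace = TRUE),
                    layer3 = sample(sizes, 300, replace = TRUE),
                    layer4 = sample(sizes, 300, replace = TRUE))
# Drop the rare degenerate draws where every layer is zero
grid4 <- grid4[rowSums(grid4) > 0, ]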

zachmayer commented 7 years ago

I vote for sampling in c(2:20) and going up to a depth of four. It's called "deepnet" after all!

dashaub commented 7 years ago

@zachmayer I'd like to see that too, although it's a slightly more disruptive change. We should probably run some performance tests to see how much slower these networks will be to fit.

dashaub commented 7 years ago

@zachmayer After putting together a simple benchmark, I found that sae.dnn.train throws errors when the network depth is 4.

library(microbenchmark)
library(deepnet)
# Reuse dat (the spam training split) from the example above
xdat <- as.matrix(dat[names(dat) != "type"])
ydat <- ifelse(dat$type == "spam", 1L, 0L)
epochs <- 10
set.seed(3)
microbenchmark(sae.dnn.train(x = xdat, y = ydat, hidden = c(2, 2, 2), numepochs = epochs),
               sae.dnn.train(x = xdat, y = ydat, hidden = c(2, 2, 2, 2), numepochs = epochs),
               sae.dnn.train(x = xdat, y = ydat, hidden = c(5, 5, 5), numepochs = epochs),
               sae.dnn.train(x = xdat, y = ydat, hidden = c(5, 5, 5, 5), numepochs = epochs),
               sae.dnn.train(x = xdat, y = ydat, hidden = c(20, 20, 20), numepochs = epochs),
               sae.dnn.train(x = xdat, y = ydat, hidden = c(20, 20, 20, 20), numepochs = epochs),
               times = 5)

topepo commented 7 years ago

I've merged in the PR.

@dashaub You should contact the package maintainer about that