A high-performance, easy-to-use, and scalable machine learning (ML) package that includes a linear model (LR), factorization machines (FM), and field-aware factorization machines (FFM), with Python and CLI interfaces.
Code:

import xlearn as xl

# train_x shape: (51960, 7193), train_y shape: (51960,)
# test_x shape:  (22269, 7193), test_y shape:  (22269,)

# Build xLearn DMatrix objects from the dense feature arrays and labels.
xdm_train = xl.DMatrix(train_x.toarray(), train_y)
xdm_test = xl.DMatrix(test_x.toarray(), test_y)

# Factorization machine for binary classification.
fm_model = xl.create_fm()
fm_model.setTrain(xdm_train)
fm_model.setValidate(xdm_test)

xparams = {
    'task': 'binary',
    'metric': 'auc',
    'lr': 0.02,
    'lambda': 0.0002,
    'epoch': 5,
    'opt': 'FTRL',
}
fm_model.fit(xparams, 'xmodel.out')
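In case it helps narrow things down, here is the same configuration with FTRL's own hyperparameters written out explicitly. This is only a sketch: the key names 'alpha', 'beta', 'lambda_1', and 'lambda_2' are assumed from the xLearn hyperparameter tuning guide, the values shown are illustrative rather than tuned, and the result reported below comes from the original parameters above, not from this variant.

# Same FM model as above, with FTRL's optimizer-specific hyperparameters set explicitly.
# NOTE: key names assumed from the xLearn tuning guide; values are illustrative only.
xparams_ftrl = {
    'task': 'binary',
    'metric': 'auc',
    'lr': 0.02,
    'lambda': 0.0002,
    'epoch': 5,
    'opt': 'FTRL',
    'alpha': 0.002,     # FTRL per-coordinate learning-rate scale
    'beta': 0.8,        # FTRL smoothing term
    'lambda_1': 0.001,  # L1 regularization for FTRL
    'lambda_2': 1.0,    # L2 regularization for FTRL
}
# fm_model.fit(xparams_ftrl, 'xmodel_ftrl.out')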
result:

[------------] Epoch      Train log_loss      Test log_loss      Test AUC     Time cost (sec)
[   10%      ]     1                 nan                nan      0.500000               11.07
[   20%      ]     2                 nan                nan      0.500000               11.18
[   30%      ]     3                 nan                nan      0.500000               11.50
[   40%      ]     4                 nan                nan      0.500000               10.99
[   50%      ]     5                 nan                nan      0.500000               10.72
[   60%      ]     6                 nan                nan      0.500000               11.90
[   70%      ]     7                 nan                nan      0.500000               11.87
[   80%      ]     8                 nan                nan      0.500000               11.12
[   90%      ]     9                 nan                nan      0.500000               11.02
[  100%      ]    10                 nan                nan      0.500000               11.16
When using the linear model, the result is fine. So what is the problem with the FM model? Any help would be appreciated. :)
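For reference, the linear-model run that works fine is essentially the same code with create_linear() in place of create_fm(). A minimal sketch, assuming the same DMatrix objects and the same xparams as above:

# Linear model (LR) on the same data with the same parameters.
# Unlike the FM run above, this one reportedly trains without NaN log loss.
lr_model = xl.create_linear()
lr_model.setTrain(xdm_train)
lr_model.setValidate(xdm_test)
lr_model.fit(xparams, 'xmodel_lr.out')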