mdabros / SharpLearning

Machine learning for C# .Net
MIT License

Training and testing on the validation set works, but single double[] predictions all return the same answer #155

Closed pjsgsy closed 5 months ago

pjsgsy commented 5 months ago

Hi - Firstly, thank you for sharing this excellent work. Proper neural network support for .NET, especially with documentation, still seems rare despite ML.NET, etc.

I have a standard NN defined with an input layer like this

net.Add(new InputLayer(width: 100, height: 100, depth: 1))

I read a 1D array from a CSV (one observation per line) using the same code as in the samples: a small CSV for validation, a larger one for training. This seems to work fine; loss decreases during training, and the test on the validation set afterwards gives a reasonable result. Again, I trained and validated using the same code as in the NN classification example - I just changed the shape of the input layer and provided my own data.
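For context, the loading pattern in the SharpLearning samples looks roughly like the sketch below. The file name and target column name here are assumptions for illustration; the `CsvParser` usage follows the repo's classification examples.

```csharp
using System.IO;
using SharpLearning.InputOutput.Csv;

// Parse the training CSV: every column except the target column
// becomes part of the observation matrix (here, 100 x 100 = 10,000
// features per row). "training.csv" and "target" are placeholders.
var parser = new CsvParser(() => new StreamReader("training.csv"));
var targetName = "target";

var observations = parser.EnumerateRows(c => c != targetName)
    .ToF64Matrix();
var targets = parser.EnumerateRows(targetName)
    .ToF64Vector();
```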

However, after training, when I try to send a single double[] to model.PredictProbability(observation) to get individual predictions at run time, I always get back exactly the same answer (class) and probability, despite the input changing!

I've really no idea what I am doing wrong here. I appreciate it is a bit of a newbie question and quite possibly unconnected with your code, but if anyone has any suggestion as to where to start to look, I'd appreciate it.
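The run-time call in question would look something like the sketch below. The `PredictProbability` call matches the SharpLearning classification examples; the length check is simply a debugging aid I would suggest, since an input layer of width 100, height 100, depth 1 expects 10,000 values, and `ParseObservation` stands in for the user-side parsing code (hypothetical, not shown in this thread).

```csharp
// Parse the incoming message into a feature vector (user-side code,
// hypothetical name). If this step silently fails, the model will be
// handed the same (e.g. all-zero) array every time.
double[] observation = ParseObservation(message);

// Sanity check: the observation must match the input layer's size.
if (observation.Length != 100 * 100)
    throw new ArgumentException(
        $"Expected 10000 features, got {observation.Length}");

// Single-observation prediction, as in the SharpLearning examples.
var prediction = model.PredictProbability(observation);
Console.WriteLine($"Predicted class: {prediction.Prediction}");
```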

Here is a training run

Sharplearning-master..
Reading data..
Preprocessing data..
Creating NN..
Training..
Using MKL Provider
Iteration: 001 - Loss 0.58076 - Time (ms): 23375
Iteration: 002 - Loss 0.54662 - Time (ms): 25110
Iteration: 003 - Loss 0.50457 - Time (ms): 24165
Iteration: 004 - Loss 0.46674 - Time (ms): 24013
Iteration: 005 - Loss 0.41939 - Time (ms): 25494
Iteration: 006 - Loss 0.38809 - Time (ms): 26673
Iteration: 007 - Loss 0.35927 - Time (ms): 25950
Iteration: 008 - Loss 0.33010 - Time (ms): 25026
Iteration: 009 - Loss 0.31003 - Time (ms): 24538
Iteration: 010 - Loss 0.29387 - Time (ms): 24504
Testing predictions on validation set..
Test Error: 0.35867565910484367
Saving model..
Model saved..

but single queries (truncated for display) and results

P:message 'predict;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;'
P:observation 0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;
S:PRR;1;0.44102612137794495
R:7500f0af-905b-454a-983d-8241e8d21229
msg : 127.0.0.1:24017 : predict;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;13
P:message 'predict;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;13'
P:observation 0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;
S:PRR;1;0.44102612137794495
R:7500f0af-905b-454a-983d-8241e8d21229
msg : 127.0.0.1:24017 : predict;0;0;0;0;0;0;0;0;0;0;0;0;0;0;269;
P:message 'predict;0;0;0;0;0;0;0;0;0;0;0;0;0;0;269;'
P:observation 0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;0;
S:PRR;1;0.44102612137794495

FWIW, I have verified that during the metric test, different results are returned for the validation data.
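Note that in the log above the incoming message changes (ending in `;13` and `;269;`) while `P:observation` stays all zeros, so the parsing step is worth inspecting. A purely illustrative sketch of what such a parser might look like (all names hypothetical; the actual user-side code is not shown in this thread):

```csharp
// Hypothetical parser for "predict;v1;v2;..." messages like those in
// the log. A classic bug of this shape is parsing into one array while
// handing the model a separate, stale, zero-initialized buffer.
static double[] ParseObservation(string message, int expectedLength)
{
    var parts = message.Split(';');
    var observation = new double[expectedLength];

    // parts[0] is the "predict" command; values start at index 1.
    for (int i = 1; i < parts.Length && i <= expectedLength; i++)
    {
        if (double.TryParse(parts[i], out var value))
            observation[i - 1] = value;
    }
    return observation;
}
```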

Any pointers are appreciated! Thanks.

pjsgsy commented 5 months ago

OK - figured it out. It was in fact a code issue on my side! Sorry for the hassle. The compliment still stands, though :) Great library!