steckdenis / nnetcpp

Flexible and fast neural network library (feed-forward, RNN, GRU, LSTM, etc)

problem with prediction #5

Open · jbensabat opened this issue 5 years ago

jbensabat commented 5 years ago

Hello, I compiled the code with the hard-coded data. When running in prediction mode, I always get the same value after the end of the training set; see below.

Could you explain what I am doing wrong? Thanks, jac

```
-0.03367 -0.0345466
start of data used for training
-0.05162 -0.0249832 -0.05785 -0.0168138 -0.05829 -0.00986043 -0.05066 -0.00393157 -0.05653 0.00115873 -0.0582 0.00558231 -0.04115 0.00949312 -0.02043 0.0130259 -0.00087 0.0162969 0.00088 0.0194043 -0.00171 0.0224282 0.00243 0.025431 0.01522 0.0284549 0.03385 0.0315202 0.0499 0.03462 0.06215 0.037714 0.06547 0.0407206 0.06494 0.043507 0.06632 0.0458803 0.07205 0.04758 0.07646 0.0482769 0.07481 0.0475827 0.0681 0.0450743 0.06729 0.0403364 0.07286 0.0330197 0.06561 0.0229061 0.04486 0.00996593 0.01604 -0.00561113 -0.00924 -0.023423 -0.03465 -0.0429004 -0.06571 -0.0633815 -0.08933 -0.0841933 -0.11226 -0.104723 -0.13533 -0.124467 -0.15354 -0.14305 -0.15968 -0.160225 -0.15558 -0.17586 -0.16089 -0.189916 -0.16801 -0.202418 -0.17019 -0.213441 -0.18657 -0.223088 -0.22277 -0.231478 -0.25849 -0.238733 -0.28191 -0.244978 -0.30395 -0.250331 -0.33891 -0.254903 -0.3847 -0.258793 -0.41923 -0.262094 -0.44245 -0.264887 -0.47085 -0.267243 -0.49645 -0.269227 -0.50708 -0.270893 -0.50851 -0.272289 -0.50828 -0.273456 -0.49741 -0.27443 -0.47335 -0.275242 -0.44446 -0.275917 -0.41713 -0.276477 -0.39397 -0.276942 -0.3721 -0.277326 -0.36676 -0.277643 -0.38291 -0.277905 -0.39999 -0.278121 -0.402 -0.278298 -0.39673 -0.278444 -0.39756 -0.278563 -0.37981 -0.278661 -0.33821 -0.278741 -0.28911 -0.278807 -0.24067 -0.27886 -0.20484 -0.278903 -0.17076 -0.278939 -0.13566 -0.278968 -0.10128 -0.278991 -0.07876 -0.27901 -0.06811 -0.279025 -0.04824 -0.279037 -0.0439 -0.279047 -0.05795 -0.279055 -0.05512 -0.279062 -0.02227 -0.279067 -0.00201 -0.279071 -0.03144 -0.279074 -0.05106 -0.279077 -0.04371 -0.279079 -0.04993 -0.279081 -0.05487 -0.279082 -0.0418 -0.279083 -0.03407 -0.279084 -0.03778 -0.279085 -0.02296 -0.279085 -0.01722 -0.279085 -0.03764 -0.279086 -0.04542 -0.279086 -0.06444 -0.279086 -0.11293 -0.279086 -0.15835 -0.279086 -0.16665 -0.279087 -0.14545 -0.279087 -0.1273 -0.279087 -0.09936 -0.279087 -0.06023 -0.279087 -0.03317 -0.279087 -0.01905
-0.279087 0.01517 -0.279087 0.05537 -0.279087 0.05926 -0.279087 0.04241 -0.279087 0.00933 -0.279087 0.01444 -0.279087 0.05665 -0.279087 0.05627 -0.279087 0.03955 -0.279087 0.02992 -0.279087 0.04654 -0.279087 0.09101 -0.279087 0.1196 -0.279087 0.1095 -0.279087 0.08591 -0.279087 0.10146 -0.279087 0.09711 -0.279087 0.04806 -0.279087 0.01745 -0.279087 -0.00624 -0.279087 -0.05097 -0.279087 -0.09354 -0.279087 -0.09771 -0.279087 -0.09505 -0.279087 -0.11365 -0.279087 -0.13552 -0.279087 -0.15621 -0.279087 -0.16899 -0.279087 -0.17307 -0.279087 -0.18547 -0.279087 -0.21603 -0.279087 -0.24129 -0.279087 -0.25401 -0.279087 -0.27697 -0.279087 -0.31584 -0.279087 -0.36555 -0.279087 -0.42198 -0.279087 -0.44669 -0.279087 -0.427 -0.279087 -0.4247 -0.279087 -0.43207 -0.279087 -0.42334 -0.279087 -0.41855 -0.279087 -0.39722 -0.279087 -0.35751 -0.279087 -0.34249 -0.279087 -0.36169 -0.279087 -0.36458 -0.279087 -0.35183 -0.279087 -0.35312 -0.279087 -0.35336 -0.279087 -0.34477 -0.279087 -0.31118 -0.279087 -0.2638 -0.279087 -0.23419 -0.279087 -0.21541 -0.279087 -0.18335 -0.279087 -0.13434 -0.279087 -0.07879 -0.279087 -0.01644 -0.279087 0.03472 -0.279087 0.05102 -0.279087 0.03221 -0.279087 0.01661 -0.279087 0.02299 -0.279087 0.01835 -0.279087 0.00895 -0.279087 0.0104 -0.279087 0.00083 -0.279087 -0.02173 -0.279087 -0.03451 -0.279087 -0.04335 -0.279087 -0.06306 -0.279087 -0.07747 -0.279087 -0.07921 -0.279087 -0.08067 -0.279087 -0.0881 -0.279087 -0.10064 -0.279087 -0.11303 -0.279087 -0.11399 -0.279087 -0.11094 -0.279087 -0.12068 -0.279087 -0.1316 -0.279087 -0.13164 -0.279087 -0.1299 -0.279087 -0.1477 -0.279087 -0.19423 -0.279087 -0.24731 -0.279087 -0.2874 -0.279087 -0.31461 -0.279087 -0.34304 -0.279087 -0.3935 -0.279087 -0.45157 -0.279087 -0.49765 -0.279087 -0.5433 -0.279087 -0.57736 -0.279087 -0.58819 -0.279087 -0.58721 -0.279087 -0.5755 -0.279087 -0.55177 -0.279087 -0.52019 -0.279087 -0.4921 -0.279087 -0.47801 -0.279087 -0.47133 -0.279087 -0.45379 -0.279087 -0.42883 -0.279087 -0.41992 -0.279087
-0.43058 -0.279087 -0.45351 -0.279087 -0.48282 -0.279087 -0.50285 -0.279087 -0.50449 -0.279087 -0.49629 -0.279087 -0.49761 -0.279087 -0.51473 -0.279087 -0.52904 -0.279087 -0.5326 -0.279087 -0.52902 -0.279087 -0.53238 -0.279087 -0.55256 -0.279087 -0.57817 -0.279087 -0.60971 -0.279087 -0.64427 -0.279087 -0.68303 -0.279087 -0.72298 -0.279087 -0.76154 -0.279087 -0.80728 -0.279087 -0.83626 -0.279087 -0.83992 -0.279087 -0.83242 -0.279087 -0.82862 -0.279087 -0.82565 -0.279087 -0.80799 -0.279087 -0.78567 -0.279087 -0.75777 -0.279087 -0.71782 -0.279087 -0.67034 -0.279087 -0.63977 -0.279087 -0.63538 -0.279087 -0.63348 -0.279087 -0.62516 -0.279087 -0.613 -0.279087 -0.60485 -0.279087 -0.59716 -0.279087 -0.59254 -0.279087 -0.58835 -0.279087 -0.57021 -0.279087 -0.54061 -0.279087 -0.49884 -0.279087 -0.4491 -0.279087 -0.40085 -0.279087 -0.36362 -0.279087 -0.33658 -0.279087 -0.31449 -0.279087 -0.29045 -0.279087 -0.2611 -0.279087 -0.2247 -0.279087 -0.17177 -0.279087 -0.11554 -0.279087 -0.05367 -0.279087
```
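Reading the dump as pairs of (target value, network output), the second number of each pair converges smoothly to -0.279087 and then never changes again. That pattern is the classic signature of closed-loop prediction collapsing to a fixed point: if the network's own output is fed back as the next input, the iteration x_{t+1} = f(x_t) can converge to a point where f(x*) = x*, after which every prediction is the same constant. A minimal sketch of the effect, with a made-up toy predictor (this is not nnetcpp code, and the weights are arbitrary):

```python
import math

# Toy stand-in for a trained one-step predictor. The weights are
# invented for illustration and have nothing to do with nnetcpp's model.
def predict(x, w=0.5, b=-0.2):
    return math.tanh(w * x + b)

# Closed-loop generation: each output becomes the next input.
x = -0.0345  # seed with some last "real" value
outputs = []
for _ in range(50):
    x = predict(x)
    outputs.append(x)

# Because |d/dx tanh(w*x + b)| <= |w| < 1, the map is a contraction,
# so the iterates converge to a fixed point x* = predict(x*): every
# later "prediction" is the same constant value.
print(outputs[-2], outputs[-1])
```

If that is what is happening here, the fix would lie in how the prediction loop feeds inputs to the network (e.g. using real inputs rather than fed-back outputs, or resetting the recurrent state), rather than in the training itself.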