Holmeyoung / crnn-pytorch

Pytorch implementation of CRNN (CNN + RNN + CTCLoss) for all language OCR.
MIT License

inference accuracy #58

Closed Zrufy closed 4 years ago

Zrufy commented 4 years ago

I wanted to ask how I can get the accuracy during inference. For example, I pass an image of text to the network, it reads it, and it tells me how accurate the reading is.

Zrufy commented 4 years ago

@Holmeyoung can you help me?

Holmeyoung commented 4 years ago

Maybe you want to get the confidence of the output results. If so, you can use the raw score of the result: the bigger, the better.
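As a minimal sketch of reading those raw scores off the output (the tensor below is a random stand-in; in practice it would come from the repo's CRNN forward pass):

```python
import torch

# Random stand-in for the network output: 26 timesteps, batch of 1, 37 classes.
preds = torch.randn(26, 1, 37)

raw_scores, labels = preds.max(2)  # highest raw score and its class index per step
print(raw_scores.squeeze(1))       # "the bigger, the better"
```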

Zrufy commented 4 years ago

OK, but what can I compare it with to know whether it is a good result? Is there a value that corresponds to 100%?

Holmeyoung commented 4 years ago

With itself. If the score equals 1, it is 100%.

Zrufy commented 4 years ago

For example, with one prediction I get this tensor.

```
tensor([[[-1.3282e+01, -1.3920e+01, -1.5807e+01, -2.0889e+01, -1.3084e+01,
          -1.7415e+01, -4.7795e+00, -1.7587e+01, -1.5715e+01, -5.3365e+00,
          -6.4690e+00, -1.9096e+01, -2.1315e+01, -1.3012e+01, -1.4305e+01,
          -2.9076e+01, -1.5422e+01, -6.7934e-01, -2.0013e+01, -1.6298e+01,
          -1.7253e+01, -7.1176e+00, -1.5685e+01, -1.1244e+01, -1.0481e+01,
          -1.1754e+01, -9.2222e-01, -2.8651e+01, -1.9495e+01, -2.5293e+01,
          -2.6247e+01, -2.7565e+01, -2.5554e+01, -1.7686e+01, -2.2920e+01,
          -1.6853e+01, -1.0848e+01]],

        [[-1.6091e+01, -5.1685e+00, -1.3691e+01, -4.5297e+00, -1.1779e+01,
          -1.5111e+01, -2.3522e+01, -1.1473e+01, -1.1852e+01, -4.9391e+00,
          -1.2647e+01, -1.2440e+01, -1.6720e+01, -1.3343e+01, -1.3185e+01,
          -1.6600e-01, -7.4860e+00, -3.9010e+00, -1.0596e+01, -1.5283e+01,
          -1.1510e+01, -1.6725e+01, -2.3571e+00, -2.2355e+01, -3.6546e+00,
          -2.2424e+01, -1.8987e+01, -3.3895e+01, -1.6536e+01, -2.3033e+01,
          -1.8332e+01, -2.8825e+01, -1.5854e+01, -1.9391e+01, -1.7802e+01,
          -2.1142e+01, -1.3176e+01]],

        [[ 1.9732e-03, -1.8828e+01, -3.0509e+01, -2.5538e+01, -2.2492e+01,
          -2.1067e+01, -2.6731e+01, -2.5323e+01, -1.4875e+01, -1.8335e+01,
          -2.0746e+01, -2.5408e+01, -2.8690e+01, -2.3018e+01, -2.9535e+01,
          -3.1846e+01, -2.7156e+01, -2.5222e+01, -3.0866e+01, -2.5687e+01,
          -2.4893e+01, -2.7228e+01, -2.7821e+01, -2.2138e+01, -2.3910e+01,
          -2.5565e+01, -2.7434e+01, -2.0280e+01, -2.1280e+01, -2.1709e+01,
          -1.8160e+01, -3.0827e+01, -2.2248e+01, -2.5155e+01, -2.3985e+01,
          -2.9333e+01, -2.0942e+01]],

        [[ 2.3320e-03, -1.7091e+01, -2.5203e+01, -2.6242e+01, -2.3491e+01,
          -2.2458e+01, -2.7192e+01, -1.9685e+01, -2.2685e+01, -1.4332e+01,
          -1.9070e+01, -2.6630e+01, -2.9941e+01, -1.8998e+01, -3.1297e+01,
          -2.5323e+01, -2.8956e+01, -1.7909e+01, -3.2779e+01, -2.9130e+01,
          -2.5086e+01, -2.7844e+01, -2.3311e+01, -2.4212e+01, -1.8528e+01,
          -2.8098e+01, -2.7917e+01, -2.7252e+01, -2.2216e+01, -2.2682e+01,
          -1.9981e+01, -3.5246e+01, -1.9074e+01, -2.6697e+01, -2.3299e+01,
          -2.9992e+01, -1.9789e+01]],

        [[ 2.7117e-03, -1.6522e+01, -1.8523e+01, -2.1105e+01, -2.2257e+01,
          -2.4067e+01, -3.0422e+01, -1.4238e+01, -2.3798e+01, -1.6958e+01,
          -2.1875e+01, -1.7975e+01, -2.2355e+01, -1.8768e+01, -2.3074e+01,
          -1.6560e+01, -2.4011e+01, -1.6805e+01, -3.1448e+01, -2.8215e+01,
          -2.4780e+01, -2.7729e+01, -1.6999e+01, -2.2074e+01, -1.7712e+01,
          -2.7266e+01, -2.7500e+01, -2.9174e+01, -2.2525e+01, -2.3093e+01,
          -2.1292e+01, -3.3593e+01, -1.7448e+01, -2.7965e+01, -2.1578e+01,
          -2.9390e+01, -2.2183e+01]],

        [[-9.8091e-04, -2.5143e+01, -2.3918e+01, -2.2630e+01, -2.7452e+01,
          -2.5786e+01, -3.1760e+01, -2.3253e+01, -1.5384e+01, -2.5796e+01,
          -2.4649e+01, -1.9143e+01, -2.1908e+01, -2.3898e+01, -2.2633e+01,
          -2.3043e+01, -1.9910e+01, -2.9929e+01, -3.0768e+01, -3.0313e+01,
          -3.2314e+01, -3.0667e+01, -1.9596e+01, -2.2217e+01, -3.0403e+01,
          -2.6748e+01, -3.0432e+01, -2.8051e+01, -3.3033e+01, -3.1371e+01,
          -2.5237e+01, -2.8984e+01, -1.9938e+01, -3.1726e+01, -2.9838e+01,
          -3.4384e+01, -2.6494e+01]],

        [[-2.3217e-02, -1.5539e+01, -2.4626e+01, -2.5496e+01, -2.1325e+01,
          -2.1998e+01, -2.6161e+01, -1.9952e+01, -2.1157e+01, -1.2741e+01,
          -1.9441e+01, -2.9324e+01, -2.8535e+01, -1.8946e+01, -2.9238e+01,
          -2.3129e+01, -2.7386e+01, -1.7932e+01, -3.0906e+01, -2.7458e+01,
          -2.3979e+01, -2.7152e+01, -2.2794e+01, -2.3194e+01, -1.7527e+01,
          -2.7300e+01, -2.6885e+01, -2.3881e+01, -2.0754e+01, -2.0810e+01,
          -1.7760e+01, -3.3749e+01, -1.7857e+01, -2.4089e+01, -2.2185e+01,
          -2.8672e+01, -1.8792e+01]],

        [[-1.8825e+00, -1.3760e+01, -3.0788e+01, -3.6388e+01, -2.3407e+01,
          -2.5294e+01, -2.4017e+01, -4.0886e+01, -4.5798e+01, -1.9438e+01,
          -1.8746e+01, -4.4470e+01, -2.9576e+01, -2.6138e+01, -5.3812e+01,
          -2.9520e+01, -3.8276e+01, -2.3428e+01, -2.9496e+01, -3.0234e+01,
          -9.6103e+00, -2.1726e+01, -2.8816e+01, -3.6214e+01, -2.2124e+01,
          -3.7767e+01, -2.3140e+01, -2.8460e+01, -6.9547e-01, -1.1309e+01,
          -1.5290e+01, -2.7294e+01, -2.7700e+01, -1.3288e+01, -1.5726e+01,
          -1.6763e+01, -1.4176e+01]],

        [[ 1.0306e-02, -1.7975e+01, -2.9203e+01, -2.5575e+01, -2.1554e+01,
          -2.2472e+01, -2.6437e+01, -2.5771e+01, -2.1316e+01, -1.9181e+01,
          -1.8675e+01, -2.6707e+01, -2.8048e+01, -2.3251e+01, -3.0067e+01,
          -3.0828e+01, -2.8360e+01, -2.5278e+01, -2.9915e+01, -2.7291e+01,
          -2.4664e+01, -2.6468e+01, -2.8051e+01, -2.2359e+01, -2.3490e+01,
          -2.6845e+01, -2.6746e+01, -2.7472e+01, -1.9422e+01, -2.1292e+01,
          -1.8209e+01, -3.2517e+01, -2.4934e+01, -2.6315e+01, -2.6803e+01,
          -2.8860e+01, -2.4543e+01]],

        [[-1.9380e-01, -3.7308e+01, -3.1039e+01, -3.4661e+01, -3.3559e+01,
          -1.8302e+01, -2.2624e+01, -3.5519e+01, -1.4560e+01, -3.1456e+01,
          -4.0394e+01, -2.2225e+01, -2.5315e+01, -2.2233e+01, -3.8725e+01,
          -3.9779e+01, -2.5097e+01, -4.1872e+01, -2.2714e+01, -1.4032e+01,
          -3.2484e+01, -2.9127e+01, -4.2140e+01, -1.6715e+01, -3.2186e+01,
          -2.0442e+01, -3.5692e+01, -1.0495e+01, -3.5588e+01, -1.8406e+01,
          -2.3085e+01, -2.6957e+01, -2.2126e+01, -2.2345e+01, -3.2634e+01,
          -3.4413e+01, -3.0442e+01]],

        [[ 5.9555e-03, -2.6479e+01, -3.1634e+01, -2.4604e+01, -2.7688e+01,
          -2.0838e+01, -2.4971e+01, -2.8350e+01, -1.4594e+01, -2.7758e+01,
          -2.4222e+01, -2.4237e+01, -2.5864e+01, -2.4984e+01, -2.9817e+01,
          -3.4890e+01, -2.3683e+01, -3.2774e+01, -2.9522e+01, -2.3903e+01,
          -2.6075e+01, -2.8111e+01, -3.1436e+01, -1.8966e+01, -3.2235e+01,
          -2.6639e+01, -3.0505e+01, -1.8566e+01, -2.9193e+01, -2.2677e+01,
          -1.8680e+01, -2.9482e+01, -2.2824e+01, -2.3987e+01, -3.3690e+01,
          -3.5775e+01, -2.9580e+01]],

        [[-5.3069e-03, -2.0556e+01, -3.4093e+01, -3.1631e+01, -2.4291e+01,
          -2.2930e+01, -2.7840e+01, -2.8013e+01, -1.3173e+01, -1.9936e+01,
          -2.1550e+01, -2.7623e+01, -3.0102e+01, -2.4474e+01, -3.0952e+01,
          -3.8687e+01, -2.6137e+01, -2.6176e+01, -3.1722e+01, -2.7415e+01,
          -2.8921e+01, -2.9904e+01, -3.1554e+01, -1.8918e+01, -2.9656e+01,
          -2.5168e+01, -2.6215e+01, -1.9803e+01, -3.1694e+01, -3.4000e+01,
          -2.8281e+01, -3.7204e+01, -2.8797e+01, -2.6571e+01, -3.3252e+01,
          -3.0636e+01, -2.2889e+01]],

        [[-1.6001e+01, -2.5830e+01, -3.0575e+01, -4.3800e+01, -1.4088e+01,
          -3.0743e+01, -2.1233e+01, -4.3401e+01, -3.9414e+01, -3.1553e+01,
          -2.0102e+01, -4.5476e+01, -3.2822e+01, -3.1458e+01, -2.8162e+01,
          -5.2421e+01, -3.6866e+01, -1.9616e+01, -2.7181e+01, -3.0977e+01,
          -3.3336e+01, -1.6930e+01, -4.7464e+01, -2.1425e+01, -2.4246e+01,
          -1.7704e+01, -3.5969e-03, -3.9559e+01, -2.5101e+01, -2.8421e+01,
          -3.8394e+01, -3.3319e+01, -4.1198e+01, -2.1349e+01, -3.3339e+01,
          -1.7535e+01, -2.8008e+01]],

        [[ 1.5497e-03, -2.5682e+01, -3.6622e+01, -4.7485e+01, -2.1841e+01,
          -3.2128e+01, -2.2540e+01, -4.9067e+01, -4.4962e+01, -3.3373e+01,
          -2.7511e+01, -4.2541e+01, -3.3744e+01, -2.9449e+01, -3.7923e+01,
          -5.0511e+01, -4.4799e+01, -2.8018e+01, -3.3418e+01, -2.8119e+01,
          -3.0308e+01, -1.4912e+01, -4.9664e+01, -2.5631e+01, -2.9964e+01,
          -2.4598e+01, -1.1588e+01, -3.4644e+01, -2.8328e+01, -2.8424e+01,
          -3.9458e+01, -4.2098e+01, -3.9885e+01, -2.6666e+01, -3.6490e+01,
          -2.5237e+01, -3.0211e+01]],

        [[ 3.8890e-03, -2.1642e+01, -3.0919e+01, -2.9891e+01, -2.1738e+01,
          -2.2625e+01, -2.7039e+01, -3.3306e+01, -2.1211e+01, -2.6188e+01,
          -2.5095e+01, -2.8929e+01, -2.5942e+01, -2.6330e+01, -3.5666e+01,
          -3.5177e+01, -2.7094e+01, -3.1950e+01, -3.2023e+01, -2.5260e+01,
          -2.7212e+01, -2.8024e+01, -3.7710e+01, -1.9591e+01, -2.6443e+01,
          -2.6469e+01, -2.6498e+01, -1.6569e+01, -2.1460e+01, -2.0752e+01,
          -1.7367e+01, -2.5618e+01, -1.8576e+01, -2.1269e+01, -1.8932e+01,
          -2.6565e+01, -2.2074e+01]],

        [[-1.4563e+01, -3.0694e+01, -2.6460e+01, -3.8986e+01, -4.0746e+01,
          -3.4694e+01, -3.2026e+01, -3.2520e+01, -2.6916e+01, -3.6778e+01,
          -4.1600e+01, -1.8965e+01, -2.4978e+01, -3.0952e+01, -3.6509e+01,
          -2.6232e+01, -2.7977e+01, -4.5637e+01, -2.7370e+01, -2.2839e+01,
          -2.8306e+01, -2.4171e+01, -3.1806e+01, -2.9515e+01, -4.0235e+01,
          -3.2744e+01, -4.2449e+01, -1.5642e+01, -1.8082e+01, -1.3611e+01,
           6.4273e-03, -2.4231e+01, -2.1841e+01, -1.7328e+01, -3.4553e+01,
          -3.0571e+01, -2.7796e+01]],

        [[-3.1203e-02, -3.0009e+01, -2.6902e+01, -3.7735e+01, -3.7273e+01,
          -3.4896e+01, -3.2366e+01, -2.9630e+01, -2.4937e+01, -3.6246e+01,
          -3.6609e+01, -2.1375e+01, -2.6235e+01, -3.4116e+01, -3.5198e+01,
          -2.8552e+01, -2.8888e+01, -4.2418e+01, -3.0112e+01, -2.6147e+01,
          -3.1377e+01, -3.1055e+01, -3.2560e+01, -3.1227e+01, -3.7405e+01,
          -3.1245e+01, -3.6832e+01, -1.3196e+01, -1.6658e+01, -1.4554e+01,
          -2.7972e+00, -2.2686e+01, -2.0224e+01, -1.8737e+01, -3.2645e+01,
          -2.8969e+01, -2.4114e+01]],

        [[-1.6237e+01, -1.7255e+01, -3.2260e+01, -5.2932e+01, -3.4482e+01,
          -4.7714e+01, -3.7707e+01, -4.2963e+01, -4.6628e+01, -2.3824e+01,
          -2.5103e+01, -4.8031e+01, -3.9913e+01, -3.8644e+01, -6.1339e+01,
          -3.5600e+01, -5.0881e+01, -2.5865e+01, -3.6878e+01, -3.2858e+01,
          -2.6254e+01, -2.3593e+01, -4.3157e+01, -4.9193e+01, -2.5622e+01,
          -4.1176e+01, -2.8992e+01, -2.9695e+01,  5.5845e-03, -1.1873e+01,
          -1.5726e+01, -2.8385e+01, -3.0795e+01, -1.9078e+01, -1.7579e+01,
          -1.7185e+01, -1.6255e+01]],

        [[-1.5037e+01, -2.0871e+01, -3.8645e+01, -5.3076e+01, -3.4802e+01,
          -4.8244e+01, -3.8715e+01, -5.2800e+01, -4.6594e+01, -3.1351e+01,
          -2.5441e+01, -4.8697e+01, -4.0154e+01, -4.4539e+01, -6.2304e+01,
          -4.4128e+01, -5.1204e+01, -3.5246e+01, -3.7265e+01, -3.3551e+01,
          -2.7266e+01, -2.5373e+01, -4.8447e+01, -5.0262e+01, -3.3510e+01,
          -4.1124e+01, -2.9009e+01, -2.9347e+01, -9.4530e-04, -1.1481e+01,
          -1.6080e+01, -2.7890e+01, -3.8578e+01, -1.9224e+01, -2.2499e+01,
          -1.6837e+01, -2.2462e+01]],

        [[ 5.9511e-03, -2.1006e+01, -3.0355e+01, -3.5075e+01, -2.2390e+01,
          -2.4855e+01, -2.6877e+01, -3.1913e+01, -2.1439e+01, -2.3924e+01,
          -2.8265e+01, -3.1290e+01, -2.8330e+01, -2.5908e+01, -3.6611e+01,
          -3.2721e+01, -2.9967e+01, -3.1355e+01, -3.3272e+01, -2.5280e+01,
          -2.6936e+01, -2.7805e+01, -3.5193e+01, -2.3525e+01, -2.5160e+01,
          -2.5178e+01, -2.5702e+01, -1.7884e+01, -1.8587e+01, -2.1166e+01,
          -1.6023e+01, -2.5705e+01, -1.7934e+01, -2.4113e+01, -1.8581e+01,
          -2.6599e+01, -1.9306e+01]],

        [[ 8.0617e-03, -3.0095e+01, -2.7699e+01, -3.9363e+01, -2.2585e+01,
          -5.4801e+01, -5.5514e+01, -4.3385e+01, -3.7468e+01, -3.7176e+01,
          -3.5819e+01, -4.1499e+01, -2.8115e+01, -4.7555e+01, -4.1623e+01,
          -4.2372e+01, -4.5687e+01, -3.5413e+01, -3.1390e+01, -2.7071e+01,
          -4.7788e+01, -3.9727e+01, -5.7818e+01, -3.5660e+01, -2.1269e+01,
          -2.3034e+01, -3.0658e+01, -4.0358e+01, -3.3894e+01, -2.6298e+01,
          -4.6324e+01, -2.4420e+01, -2.8958e+01, -3.9137e+01, -1.4040e+01,
          -2.5412e+01, -3.4502e+01]],

        [[ 1.1428e-02, -2.1257e+01, -2.8599e+01, -3.4739e+01, -2.0271e+01,
          -4.3575e+01, -4.2446e+01, -4.0858e+01, -3.6319e+01, -3.6709e+01,
          -3.5529e+01, -3.9297e+01, -2.5103e+01, -3.4250e+01, -3.8602e+01,
          -3.6367e+01, -4.4315e+01, -3.2872e+01, -3.1619e+01, -2.6905e+01,
          -3.1083e+01, -2.9326e+01, -4.9035e+01, -2.5589e+01, -2.3065e+01,
          -2.4250e+01, -2.6216e+01, -3.7026e+01, -2.5327e+01, -2.4799e+01,
          -3.6188e+01, -2.4788e+01, -3.0013e+01, -3.6087e+01, -1.6511e+01,
          -2.5506e+01, -3.3401e+01]],

        [[ 3.7683e-03, -2.1870e+01, -2.8919e+01, -2.8322e+01, -2.1098e+01,
          -2.6169e+01, -2.8093e+01, -3.0396e+01, -1.9108e+01, -2.5555e+01,
          -2.3662e+01, -2.7081e+01, -2.4968e+01, -2.6662e+01, -3.0893e+01,
          -3.2003e+01, -2.6200e+01, -2.9727e+01, -3.2272e+01, -2.7306e+01,
          -2.8627e+01, -2.8098e+01, -3.3853e+01, -2.0254e+01, -2.6042e+01,
          -2.6280e+01, -2.5129e+01, -2.0448e+01, -2.4416e+01, -2.4029e+01,
          -2.0430e+01, -1.9753e+01, -2.0184e+01, -2.5080e+01, -1.9093e+01,
          -2.0832e+01, -2.3048e+01]],

        [[ 1.9156e-03, -2.1998e+01, -2.8870e+01, -3.0455e+01, -2.0998e+01,
          -2.9510e+01, -3.2108e+01, -3.2566e+01, -2.1046e+01, -2.7399e+01,
          -2.5734e+01, -2.8198e+01, -2.4583e+01, -2.7648e+01, -3.2964e+01,
          -3.2593e+01, -2.7089e+01, -3.2073e+01, -3.2425e+01, -2.9500e+01,
          -3.4511e+01, -3.0604e+01, -3.5853e+01, -2.0017e+01, -2.6375e+01,
          -2.5694e+01, -2.5200e+01, -2.5689e+01, -2.8563e+01, -3.0579e+01,
          -2.4572e+01, -1.6552e+01, -1.9797e+01, -2.8766e+01, -1.8932e+01,
          -1.7615e+01, -2.2914e+01]],

        [[ 4.6306e-03, -2.9970e+01, -2.7738e+01, -2.9833e+01, -1.8589e+01,
          -4.0985e+01, -3.6733e+01, -3.6620e+01, -2.2743e+01, -2.2724e+01,
          -1.1803e+01, -3.3937e+01, -2.4101e+01, -4.6711e+01, -3.2209e+01,
          -4.3228e+01, -2.7806e+01, -2.6065e+01, -3.0030e+01, -2.6903e+01,
          -4.5415e+01, -3.4347e+01, -4.2587e+01, -3.0725e+01, -2.2859e+01,
          -2.4116e+01, -2.2298e+01, -2.2556e+01, -2.0317e+01, -1.8516e+01,
          -3.2093e+01, -1.1945e+01, -2.2955e+01, -1.9752e+01, -1.0009e+01,
          -5.0183e+00, -1.5196e+01]],

        [[-1.5592e+01, -2.2137e+01, -2.9793e+01, -5.6071e+01, -2.9775e+01,
          -4.2330e+01, -3.2648e+01, -3.5926e+01, -2.9248e+01, -1.9092e+01,
          -1.9856e+01, -4.4915e+01, -3.7890e+01, -3.7403e+01, -5.0717e+01,
          -4.4038e+01, -4.1539e+01, -2.6295e+01, -2.7334e+01, -2.7578e+01,
          -3.7658e+01, -2.2369e+01, -4.6709e+01, -3.9984e+01, -2.4566e+01,
          -2.7091e+01, -2.1961e+01, -1.3962e+01, -1.5626e+01, -1.5747e+01,
          -2.8774e+01, -1.4639e+01, -2.6966e+01, -1.6752e+01, -1.4457e+01,
          -5.4946e-02, -1.0902e+01]]], grad_fn=<ViewBackward>)
tensor([17, 15,  0,  0,  0,  0,  0, 28,  0,  0,  0,  0, 26,  0,  0, 30,  0, 28,
        28,  0,  0,  0,  0,  0,  0, 35])
```

How do I calculate a confidence value from this prediction? Thanks in advance for the help.

Holmeyoung commented 4 years ago

The output of the net has length 26, so you get 26 arrays, one per timestep. In each array, I guess you have 37 classes (your alphabet plus the CTC blank), so the length of each array is 37. The decoder takes the index of the biggest value in each array as the label; for example, in the last array, 35 is the index of the biggest value. You should take the softmax value, and if the biggest one is 1, it means 100%.
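A minimal sketch of that computation (hedged: `preds` below is a random stand-in with the same (26, 1, 37) shape as the tensor pasted above, and the blank is assumed to be index 0, as is common in CRNN implementations):

```python
import torch
import torch.nn.functional as F

# Random stand-in with the same shape as the tensor pasted above:
# 26 timesteps, batch of 1, 37 classes (alphabet + assumed CTC blank at index 0).
preds = torch.randn(26, 1, 37)

probs = F.softmax(preds, dim=2)  # each 37-way row now sums to 1
conf, labels = probs.max(2)      # per-step confidence and predicted label index
labels = labels.squeeze(1)       # e.g. tensor([17, 15, 0, ..., 35]) above
conf = conf.squeeze(1)

# CTC-style greedy decode: drop blanks and collapse consecutive repeats,
# keeping the confidence of each surviving step.
keep = [t for t in range(len(labels))
        if labels[t] != 0 and (t == 0 or labels[t] != labels[t - 1])]
print(labels[keep])
print(conf[keep].mean().item())  # one rough sequence-level confidence
```

Averaging the surviving per-step confidences gives one rough sequence score; taking their product is the stricter alternative.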

Zrufy commented 4 years ago

If I do this check, the max value in the last numpy array is -0.054946. How can I use this as an accuracy value? Is the letter accuracy -5%? @Holmeyoung

Holmeyoung commented 4 years ago

You should use softmax to make the values sum to 1. For example, [1, 3] --> [0.25, 0.75], so maybe you can say the confidence is 75%. Or you can use the margin, (0.75 - 0.25) = 0.5; for [0.5, 0.5] the margin is 0.5 - 0.5 = 0, i.e. no confidence.
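One caveat worth adding: [1, 3] --> [0.25, 0.75] is plain normalization rather than a true softmax, which exponentiates first and comes out sharper. A quick check of both:

```python
import numpy as np

x = np.array([1.0, 3.0])
print(x / x.sum())   # plain normalization: [0.25 0.75]
e = np.exp(x)
print(e / e.sum())   # true softmax: roughly [0.119 0.881]
```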

Zrufy commented 4 years ago

Can you make me an example with this numpy array?

```
[-1.5592e+01, -2.2137e+01, -2.9793e+01, -5.6071e+01, -2.9775e+01,
 -4.2330e+01, -3.2648e+01, -3.5926e+01, -2.9248e+01, -1.9092e+01,
 -1.9856e+01, -4.4915e+01, -3.7890e+01, -3.7403e+01, -5.0717e+01,
 -4.4038e+01, -4.1539e+01, -2.6295e+01, -2.7334e+01, -2.7578e+01,
 -3.7658e+01, -2.2369e+01, -4.6709e+01, -3.9984e+01, -2.4566e+01,
 -2.7091e+01, -2.1961e+01, -1.3962e+01, -1.5626e+01, -1.5747e+01,
 -2.8774e+01, -1.4639e+01, -2.6966e+01, -1.6752e+01, -1.4457e+01,
 -5.4946e-02, -1.0902e+01]
```

For softmax, can I use this function?

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=-1, keepdims=True)
```

If I understand correctly, I take the max value, in this case -0.054946, pass it to softmax, and obtain 1.0. Is this the accuracy? If I take an array that gives me a wrong prediction, I still always obtain 1.0 from this function.
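That is likely the missing piece: softmax has to be applied to the whole 37-element array at once, not to the scalar max. The softmax of a single number is always 1.0, which is why every prediction, right or wrong, came out as 1.0. A sketch with three representative scores from the last timestep above (the full 37 values behave the same way):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))  # shift by the max for numerical stability
    return e / e.sum()

# three representative scores from the last timestep pasted above
scores = np.array([-15.592, -10.902, -0.054946])
probs = softmax(scores)
print(probs.argmax(), probs.max())  # the -0.054946 entry wins with confidence ~0.99998
```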

Zrufy commented 4 years ago

I understand the solution!!! Thank you so much for the help!!!!