skymanaditya1 / Sound_Forensics_ASR

Code related to Sound Forensics and Automatic Speech Recognition.

Python neural network #14

Closed skymanaditya1 closed 9 years ago

skymanaditya1 commented 9 years ago

Implement a Python neural network for calculating the emission probabilities of the states in the Hidden Markov Model. The current implementation computes the HMM emission probabilities using a Gaussian Mixture Model (GMM). This has to be replaced with a neural network to obtain better probability estimates and to improve the classification accuracy.
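A common way to use a neural network in place of the GMM is the hybrid NN-HMM setup: the network outputs state posteriors p(state | features), and dividing by the state priors yields scaled likelihoods that can stand in for the GMM emission probabilities during decoding. The sketch below is a minimal, hypothetical illustration of that conversion, not the code in this repository:

```python
import numpy as np

def scaled_likelihoods(posteriors, priors):
    """Convert NN state posteriors p(state | features) into scaled
    likelihoods proportional to p(features | state), by dividing out
    the state priors (Bayes' rule). These can replace GMM emission
    probabilities in HMM decoding."""
    return posteriors / priors

# Hypothetical numbers: 3 HMM states, posteriors from a softmax output
posteriors = np.array([0.7, 0.2, 0.1])
priors = np.array([0.5, 0.3, 0.2])
print(scaled_likelihoods(posteriors, priors))
```

The scaling matters because the HMM expects likelihoods p(features | state), while a classifier network naturally produces posteriors; the constant factor p(features) cancels out across states during decoding.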

skymanaditya1 commented 9 years ago

Implementation of the Python neural network will involve calculating the sigmoid function. The sigmoid will be applied to the product of the feature matrix and the weight matrix, where the feature matrix is denoted X and the weight matrix is denoted theta. A diagram illustrating a sample neural network also has to be produced.
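A minimal sketch of the sigmoid activation described above, using NumPy and the X/theta notation (the example values are hypothetical):

```python
import numpy as np

def sigmoid(z):
    """Element-wise sigmoid: maps any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# X: feature matrix (samples x features), theta: weight matrix
X = np.array([[1.0, 2.0],
              [3.0, 4.0]])
theta = np.array([[0.5],
                  [-0.5]])

# Activations are the sigmoid of the linear combination X @ theta
activations = sigmoid(X @ theta)
```

Because the sigmoid output lies strictly between 0 and 1, it can be read as a unit's activation level and feeds naturally into the probability computations later in the pipeline.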

skymanaditya1 commented 9 years ago

Implementation of the Neural Network using forward propagation and back propagation. The back-propagation algorithm can be implemented as follows:

  1. Given m training samples {(x(1), y(1)), (x(2), y(2)), ..., (x(m), y(m))}.
  2. Initialise the accumulator Delta(l)_ij to 0 for every i, j and l.
  3. For each of the m training samples:
  4. Set the activations of the first layer equal to the inputs x(i).
  5. Perform forward propagation to calculate a(l), the activations of the units in layer l, for every layer in the network.
  6. Using the target values y(i), calculate the error terms delta(L) of the output layer by subtracting the targets from the output-layer activations.
  7. Using delta(L), the error terms of the last layer, compute the error terms delta(l) for the layers L-1, L-2, ..., 2; no error term is computed for the first layer (i.e. the input layer).
  8. Accumulate the Delta values using the error terms delta from layer l+1 and the activations of the units in layer l.
  9. Compute the gradient values D for the case j != 0 (not a bias unit) and for the case j = 0 (a bias unit); these give the partial derivatives of J(theta).
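The steps above can be sketched in NumPy as follows. This is a minimal illustration assuming sigmoid units, one-hot targets Y, and weight matrices that include a bias column; names like `backprop` and `lam` are hypothetical, not from this repository:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop(thetas, X, Y, lam=0.0):
    """One pass of the back-propagation steps above.
    thetas: list of weight matrices; thetas[l] maps layer l to l+1,
            shape (units_next, units_curr + 1) to include the bias.
    X: (m, n) feature matrix; Y: (m, k) one-hot targets.
    Returns the partial derivatives dJ/dTheta for each layer."""
    m = X.shape[0]
    # Step 2: initialise the Delta accumulators to zero
    Deltas = [np.zeros_like(t) for t in thetas]
    for i in range(m):  # Step 3: loop over the m training samples
        # Step 4: first-layer activations are the inputs x(i)
        a = X[i]
        activations = []
        for theta in thetas:  # Step 5: forward propagation
            a = np.concatenate(([1.0], a))  # prepend the bias unit
            activations.append(a)
            a = sigmoid(theta @ a)
        # Step 6: output-layer error delta(L) = a(L) - y(i)
        delta = a - Y[i]
        # Steps 7-8: propagate errors back through layers L-1 ... 2,
        # accumulating Delta from delta (layer l+1) and a (layer l)
        for l in range(len(thetas) - 1, -1, -1):
            Deltas[l] += np.outer(delta, activations[l])
            if l > 0:
                a_l = activations[l]
                delta = (thetas[l].T @ delta) * a_l * (1.0 - a_l)
                delta = delta[1:]  # drop the bias-unit error
        # note: no delta is computed for the input layer
    # Step 9: average, regularising only the non-bias weights (j != 0)
    grads = []
    for Delta, theta in zip(Deltas, thetas):
        D = Delta / m
        D[:, 1:] += (lam / m) * theta[:, 1:]
        grads.append(D)
    return grads
```

The Delta accumulators correspond to step 2, and the returned D matrices are the partial derivatives of J(theta) from step 9, ready to hand to a gradient-descent update.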