enggen / Deep-Learning-Coursera

Deep Learning Specialization by Andrew Ng, deeplearning.ai.

Week 4 Programming Exercise: ValueError: shapes (4,5) and (4,4) not aligned: 5 (dim 1) != 4 (dim 0) #9

Status: Open. arilwan opened this issue 4 years ago.

arilwan commented 4 years ago
# GRADED FUNCTION: L_model_forward

def L_model_forward(X, parameters):
    """
    Implement forward propagation for the [LINEAR->RELU]*(L-1)->LINEAR->SIGMOID computation

    Arguments:
    X -- data, numpy array of shape (input size, number of examples)
    parameters -- output of initialize_parameters_deep()

    Returns:
    AL -- last post-activation value
    caches -- list of caches containing:
                every cache of linear_activation_forward() (there are L-1 of them, indexed from 0 to L-1)
    """

    caches = []
    A = X
    L = len(parameters) // 2                  # number of layers in the neural network
    #print(parameters)
    # Implement [LINEAR -> RELU]*(L-1). Add "cache" to the "caches" list.
    for l in range(1, L):
        A_prev = A 
        ### START CODE HERE ### (≈ 2 lines of code)
        A, cache = linear_activation_forward(A_prev, parameters['W'+str(1)], parameters['b'+str(1)], activation='relu')
        caches.append(cache)
        ### END CODE HERE ###

    # Implement LINEAR -> SIGMOID. Add "cache" to the "caches" list.
    ### START CODE HERE ### (≈ 2 lines of code)
    AL, cache = linear_activation_forward(A, parameters['W'+str(L)], parameters['b'+str(L)], activation='sigmoid')
    caches.append(cache)
    ### END CODE HERE ###

    assert(AL.shape == (1,X.shape[1]))

    return AL, caches

And then:

X, parameters = L_model_forward_test_case_2hidden()
AL, caches = L_model_forward(X, parameters)
print("AL = " + str(AL))
print("Length of caches list = " + str(len(caches)))

Error: ValueError: shapes (4,5) and (4,4) not aligned: 5 (dim 1) != 4 (dim 0)

Any idea how to fix this?

ramyasusarla commented 4 years ago

Getting the same error. Have you figured out how to fix it?

eshtaranyal commented 4 years ago

Replace this line:

A, cache = linear_activation_forward(A_prev, parameters['W'+str(1)], parameters['b'+str(1)], activation='relu')

with:

A, cache = linear_activation_forward(A, parameters['W'+str(1)], parameters['b'+str(1)], activation='relu')

krsnadas919 commented 3 years ago

There is no problem with A_prev; the problem is with '1'. Since you are looping, the index should be the loop variable 'l', i.e. parameters['W'+str(l)] and parameters['b'+str(l)]. With str(1), every iteration reuses W1 and b1, so from the second hidden layer onward the shapes no longer align, which is exactly the ValueError reported above.
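For reference, here is a minimal self-contained sketch of the corrected loop. The `relu`/`sigmoid` helpers and the parameter shapes below are stand-ins chosen to mimic the 2-hidden-layer test case, not the course's actual helper implementations:

```python
import numpy as np

def relu(Z):
    return np.maximum(0, Z)

def sigmoid(Z):
    return 1 / (1 + np.exp(-Z))

def linear_activation_forward(A_prev, W, b, activation):
    # Stand-in for the course helper: linear step followed by the activation.
    Z = W @ A_prev + b
    A = relu(Z) if activation == 'relu' else sigmoid(Z)
    return A, ((A_prev, W, b), Z)

def L_model_forward(X, parameters):
    caches = []
    A = X
    L = len(parameters) // 2          # number of layers in the network
    for l in range(1, L):
        A_prev = A
        # Index by the loop variable l, not the literal 1, so each
        # layer uses its own weights W1..W(L-1).
        A, cache = linear_activation_forward(
            A_prev, parameters['W' + str(l)], parameters['b' + str(l)],
            activation='relu')
        caches.append(cache)
    # Final layer: LINEAR -> SIGMOID.
    AL, cache = linear_activation_forward(
        A, parameters['W' + str(L)], parameters['b' + str(L)],
        activation='sigmoid')
    caches.append(cache)
    assert AL.shape == (1, X.shape[1])
    return AL, caches

# Hypothetical parameters for a 5 -> 4 -> 3 -> 1 network, 4 examples.
rng = np.random.default_rng(0)
parameters = {
    'W1': rng.standard_normal((4, 5)), 'b1': np.zeros((4, 1)),
    'W2': rng.standard_normal((3, 4)), 'b2': np.zeros((3, 1)),
    'W3': rng.standard_normal((1, 3)), 'b3': np.zeros((1, 1)),
}
X = rng.standard_normal((5, 4))
AL, caches = L_model_forward(X, parameters)
print(AL.shape)        # (1, 4)
print(len(caches))     # 3
```

With `str(1)` instead of `str(l)`, the second iteration would try `W1 @ A` with `W1` of shape (4, 5) and `A` of shape (4, 4), reproducing the reported ValueError.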