Closed ilyassh closed 5 years ago
Hello,
I'm trying to implement the Wasserstein GAN, but I find the expression you use as the Wasserstein loss function confusing.
I used 2 identical vectors and compared their earth mover distance (or 1-Wasserstein distance) using your function (slightly modified for evaluation):
```python
def wasserstein(y_pred, y_true):
    return K.eval(K.mean(y_pred * y_true))

y_pred = y_true = K.variable(np.array([[1, 1], [2, 2]]))
wasserstein(y_pred, y_true)  # returns 2.5
```
Knowing that the Wasserstein distance between two identical probability distributions is 0, I was confused by the returned value. To check my answer, I used another Wasserstein function, which can be computed this way:
```python
def wasserstein_2(y_pred, y_true):
    return K.eval(tf.reduce_mean(y_true) - tf.reduce_mean(y_pred))
```
This one returns 0, which is what I was expecting.
However, I used your function during training and it seems to converge anyway. Do you have an explanation for this?
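My guess (an assumption on my part, not something I've confirmed in your code) is that during training `y_true` is not a second distribution but a tensor of ±1 labels marking real vs. fake samples, in which case `K.mean(y_pred * y_true)` reduces to the difference of mean critic scores that WGAN actually needs. A small numpy sketch of that reading, with made-up critic scores:

```python
import numpy as np

# Hypothetical critic scores for a mixed batch (values invented for illustration).
real_scores = np.array([0.8, 1.2, 0.5])
fake_scores = np.array([-0.3, 0.1, -0.7])

scores = np.concatenate([real_scores, fake_scores])
# Assumed WGAN labeling convention: -1 for real samples, +1 for fake ones,
# so that mean(labels * scores) penalizes low real scores and high fake scores.
labels = np.concatenate([-np.ones(3), np.ones(3)])

loss = np.mean(labels * scores)
# With equal numbers of real and fake samples, this equals
# 0.5 * (mean fake score - mean real score):
diff = 0.5 * (fake_scores.mean() - real_scores.mean())
assert np.isclose(loss, diff)
```

If that assumption holds, feeding the same tensor as both `y_pred` and `y_true` (as in my test above) is simply outside the intended use of the function, which would explain why it behaves fine in training but not as a standalone distance.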
Thank you very much for your time :)
Ilyass