nlintz / TensorFlow-Tutorials

Simple tutorials using Google's TensorFlow Framework

Modify 11_gan #84

Closed lanpay-lulu closed 7 years ago

lanpay-lulu commented 7 years ago

Implement a working version of a GAN based on the original 11_gan.py.

Main modifications are as follows:

lanpay-lulu commented 7 years ago

Why did the Travis CI build fail? Any help?

hunkim commented 7 years ago

I'll look into it.


hunkim commented 7 years ago

Overall, I like to have simple examples. Do you think you can simplify your PR and make only necessary changes?

hunkim commented 7 years ago

Add `+1e-9` or something?

Sung

Quoting lanpay-lulu's review comment on 11_gan.py (https://github.com/nlintz/TensorFlow-Tutorials/pull/84#discussion_r106079397):

```python
def generate_z(n=1):
    return np.random.normal(size=(n, z_size))

sample = G(z)

# Objective functions
G_objective = -tf.reduce_mean(tf.log(D(G(z))))
```

"When D(G(z)) == 0, log(0) will cause an exception."
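For reference, a quick plain-NumPy sketch (not the PR's TensorFlow code) of why a discriminator output of exactly 0 blows up the generator loss, and how the small epsilon suggested above avoids it:

```python
import numpy as np

# Hypothetical discriminator outputs, including an exact 0
d_out = np.array([0.0, 0.5, 1.0])

with np.errstate(divide="ignore"):
    naive = -np.log(d_out)        # first entry is +inf

eps = 1e-9                        # the small constant suggested above
stable = -np.log(d_out + eps)     # finite everywhere

print(naive[0], stable[0])
```

In TensorFlow 1.x this would look like `tf.log(D(G(z)) + 1e-9)`; using the built-in cross-entropy op (see below in the thread) sidesteps the issue entirely.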

lanpay-lulu commented 7 years ago

I still insist this is the simplest GAN version. And tf.nn.sigmoid_cross_entropy_with_logits is one of the most common functions in TF that everyone needs to know and use. This is the version that really generates something meaningful. That's why I strongly recommend it. ^ ^
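For context, `tf.nn.sigmoid_cross_entropy_with_logits` computes its loss in a numerically stable closed form, `max(x, 0) - x*z + log(1 + exp(-|x|))`, which is why it avoids the log(0) problem discussed above. A plain-NumPy sketch of that formula (an illustration, not the TF source):

```python
import numpy as np

def sigmoid_cross_entropy_with_logits(logits, labels):
    """Stable form of -z*log(sigmoid(x)) - (1-z)*log(1-sigmoid(x)):
    max(x, 0) - x*z + log(1 + exp(-|x|))."""
    x = np.asarray(logits, dtype=float)
    z = np.asarray(labels, dtype=float)
    return np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))

# Stays finite even for extreme logits, where the naive
# -log(sigmoid(x)) would hit log(0):
print(sigmoid_cross_entropy_with_logits([-100.0, 0.0, 100.0], [1.0, 1.0, 1.0]))
```

Because the op takes raw logits, the discriminator's final layer needs no sigmoid of its own.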

hunkim commented 7 years ago

D's output is a single unit, right? In that case, could you just use logistic loss (not cross-entropy)?
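For what it's worth, for a single sigmoid output the two coincide: the binary logistic loss -z*log(σ(x)) - (1-z)*log(1-σ(x)) is exactly sigmoid cross-entropy applied to the logit. A NumPy check of that identity (an aside, not code from the PR):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def logistic_loss(x, z):
    # Naive binary logistic loss on the probability sigmoid(x)
    return -z * np.log(sigmoid(x)) - (1 - z) * np.log(1 - sigmoid(x))

def xent_with_logits(x, z):
    # Stable form, as documented for tf.nn.sigmoid_cross_entropy_with_logits
    return np.maximum(x, 0) - x * z + np.log1p(np.exp(-np.abs(x)))

x = np.linspace(-5, 5, 11)
print(np.allclose(logistic_loss(x, 1.0), xent_with_logits(x, 1.0)))  # → True
```

So the choice is really about numerical form, not about which loss is being minimized.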

hunkim commented 7 years ago

Anyway, I am ready to merge this. Do you have anything to push in this change?

lanpay-lulu commented 7 years ago

That's all to merge for this version. We can continue to upgrade it in the future. ^ ^

hunkim commented 7 years ago

Sounds good.