keras-team / keras-io

Keras documentation, hosted live at keras.io

Updated Variational AutoEncoder example for Keras 3 #1836

Closed: sitamgithub-MSIT closed this pull request 2 months ago

sitamgithub-MSIT commented 2 months ago

This PR updates the Variational AutoEncoder example for Keras 3 (TF-only backend). All TF ops are replaced with the equivalent Keras ops.
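For readers comparing old and new calls, here is a minimal sketch of the TF-to-Keras-ops mapping used in the diff below. The snippet itself is illustrative and not part of the changed file:

```python
import os

os.environ["KERAS_BACKEND"] = "tensorflow"  # the example remains TF-only

import keras
from keras import ops

x = keras.random.normal(shape=(2, 3))  # was: tf.random.normal(shape=(2, 3))
print(ops.shape(x))  # was: tf.shape(x)
print(ops.exp(0.5 * x))  # was: tf.exp(0.5 * x)
# was: tf.reduce_mean(tf.reduce_sum(tf.square(x), axis=1))
print(ops.mean(ops.sum(ops.square(x), axis=1)))
```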

For reference, here is a Colab notebook with the updated example: https://colab.research.google.com/drive/15VMQIEUK8jhqI8nYyFSfWDh9fsMMs116?usp=sharing

cc: @fchollet

Here is the git diff for the changed file:

Changes:

```diff
diff --git a/examples/generative/vae.py b/examples/generative/vae.py
index d1d195c9..3396f4f7 100644
--- a/examples/generative/vae.py
+++ b/examples/generative/vae.py
@@ -18,6 +18,7 @@ os.environ["KERAS_BACKEND"] = "tensorflow"
 import numpy as np
 import tensorflow as tf
 import keras
+from keras import ops
 from keras import layers
 
 """
@@ -30,10 +31,10 @@ class Sampling(layers.Layer):
 
     def call(self, inputs):
         z_mean, z_log_var = inputs
-        batch = tf.shape(z_mean)[0]
-        dim = tf.shape(z_mean)[1]
-        epsilon = tf.random.normal(shape=(batch, dim))
-        return z_mean + tf.exp(0.5 * z_log_var) * epsilon
+        batch = ops.shape(z_mean)[0]
+        dim = ops.shape(z_mean)[1]
+        epsilon = keras.random.normal(shape=(batch, dim))
+        return z_mean + ops.exp(0.5 * z_log_var) * epsilon
 
 """
@@ -94,14 +95,14 @@ class VAE(keras.Model):
         with tf.GradientTape() as tape:
             z_mean, z_log_var, z = self.encoder(data)
             reconstruction = self.decoder(z)
-            reconstruction_loss = tf.reduce_mean(
-                tf.reduce_sum(
+            reconstruction_loss = ops.mean(
+                ops.sum(
                     keras.losses.binary_crossentropy(data, reconstruction),
                     axis=(1, 2),
                 )
             )
-            kl_loss = -0.5 * (1 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var))
-            kl_loss = tf.reduce_mean(tf.reduce_sum(kl_loss, axis=1))
+            kl_loss = -0.5 * (1 + z_log_var - ops.square(z_mean) - ops.exp(z_log_var))
+            kl_loss = ops.mean(ops.sum(kl_loss, axis=1))
             total_loss = reconstruction_loss + kl_loss
         grads = tape.gradient(total_loss, self.trainable_weights)
         self.optimizer.apply_gradients(zip(grads, self.trainable_weights))
```
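For reviewers who want a quick sanity check, the updated `Sampling` layer from the diff can be exercised on its own. A minimal sketch, assuming the same TF backend setting as the example (this snippet is illustrative and not part of the PR):

```python
import os

os.environ["KERAS_BACKEND"] = "tensorflow"

import keras
from keras import layers, ops


class Sampling(layers.Layer):
    """Uses (z_mean, z_log_var) to sample z via the reparameterization trick."""

    def call(self, inputs):
        z_mean, z_log_var = inputs
        batch = ops.shape(z_mean)[0]
        dim = ops.shape(z_mean)[1]
        # z = mu + sigma * epsilon, with sigma = exp(0.5 * log_var)
        epsilon = keras.random.normal(shape=(batch, dim))
        return z_mean + ops.exp(0.5 * z_log_var) * epsilon


# Feed dummy latent statistics through the layer and check the output shape.
z_mean = keras.random.normal(shape=(4, 2))
z_log_var = keras.random.normal(shape=(4, 2))
z = Sampling()([z_mean, z_log_var])
print(ops.shape(z))  # expected: (4, 2)
```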
sitamgithub-MSIT commented 2 months ago

> LGTM, thank you -- please add the generated files.

Absolutely! The generated files have been added.