apple/tensorflow_macos

TensorFlow for macOS 11.0+ accelerated using Apple's ML Compute framework.

LookupError: gradient registry has no entry for: MLCLayerNormGrad #248

boluoyu commented 3 years ago

When I run a GAN on my M1 Mac, I get the following error:


 /Users/boluoyu/WorkSpace/PycharmProjects/ai/test/wgan_gp_models.py:135 train_d  *
        grad = t.gradient(cost, self.D.trainable_variables)
    /Users/boluoyu/miniforge3/envs/tf2/lib/python3.8/site-packages/tensorflow/python/eager/backprop.py:1080 gradient  **
        flat_grad = imperative_grad.imperative_grad(
    /Users/boluoyu/miniforge3/envs/tf2/lib/python3.8/site-packages/tensorflow/python/eager/imperative_grad.py:71 imperative_grad
        return pywrap_tfe.TFE_Py_TapeGradient(
    /Users/boluoyu/miniforge3/envs/tf2/lib/python3.8/site-packages/tensorflow/python/eager/backprop.py:151 _gradient_function
        grad_fn = ops._gradient_registry.lookup(op_name)  # pylint: disable=protected-access
    /Users/boluoyu/miniforge3/envs/tf2/lib/python3.8/site-packages/tensorflow/python/framework/registry.py:98 lookup
        raise LookupError(

    LookupError: gradient registry has no entry for: MLCLayerNormGrad

The code that triggers it:


    @tf.function
    def train_d(self, x_real):
        z = random.normal((self.batch_size, 1, 1, self.z_dim))
        with tf.GradientTape() as t:
            x_fake = self.G(z, training=True)
            fake_logits = self.D(x_fake, training=True)
            real_logits = self.D(x_real, training=True)
            cost = ops.d_loss_fn(fake_logits, real_logits)
            gp = self.gradient_penalty(partial(self.D, training=True), x_real, x_fake)
            cost += self.grad_penalty_weight * gp
        grad = t.gradient(cost, self.D.trainable_variables)  # error: LookupError is raised here
        self.d_opt.apply_gradients(zip(grad, self.D.trainable_variables))
        return cost

On my Intel Mac, the same code works fine.
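
I think the unregistered MLCLayerNormGrad gradient corresponds to the LayerNormalization layers inside my discriminator (an assumption on my part, since the definition of self.D is not shown above). A minimal sketch that exercises the same gradient path:

    import tensorflow as tf

    # Minimal sketch (assumption: keras.layers.LayerNormalization is what the
    # M1 build rewrites to the MLC layer-norm op).
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(8),
        tf.keras.layers.LayerNormalization(),
        tf.keras.layers.Dense(1),
    ])

    @tf.function
    def step(x):
        with tf.GradientTape() as tape:
            loss = tf.reduce_mean(tf.square(model(x, training=True)))
        return tape.gradient(loss, model.trainable_variables)

    step(tf.random.normal((4, 8)))  # on the M1 build I expect this to hit the same LookupError

If the rewrite to MLC ops only happens in graph mode, forcing eager execution with tf.config.run_functions_eagerly(True) might sidestep the error (at the cost of speed), but I have not verified that.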