tensorflow / addons

Useful extra functionality for TensorFlow 2.x maintained by SIG-addons
Apache License 2.0

TensorFlow Addons Wind Down #2807

Open seanpmorgan opened 1 year ago

seanpmorgan commented 1 year ago

Dear contributors and users of TensorFlow Addons,

As many of you know, TensorFlow Addons (TFA) is a repository of community-maintained and contributed extensions for TensorFlow, first created in 2018 and maintained by the SIG-Addons community. Over the course of 4 years, 200 contributors have built the TFA repository into a community-owned and community-managed success that is being utilized by over 8,000 GitHub repositories, according to our dependency graph. I’d like to take a moment to sincerely thank everyone involved as a contributor or community member for their efforts.

Recently, there has been increasing overlap in contributions and scope between TFA and the Keras-CV and Keras-NLP libraries. To prevent future overlap, we believe that new and existing addons to TensorFlow will be best maintained in Keras project repositories, where possible.

Decision to Wind Down TensorFlow Addons

We believe that it is in the best interest of the TensorFlow community to consolidate where TensorFlow extensions can be utilized, maintained and contributed. Because of this, it is bittersweet that we are announcing our plans to move TensorFlow Addons to a minimal maintenance and release mode.

TFA SIG Addons will be ending development and introduction of new features to this repository. TFA will be transitioning to a minimal maintenance and release mode for one year in order to give appropriate time for you to adjust any dependencies to the overlapping repositories in our TensorFlow community (Keras, Keras-CV, and Keras-NLP). Going forward, please consider contributing to the Keras-CV and Keras-NLP projects.

Background:

The original RFC proposal for TFA was dated 2018-12-14, with the stated goal of building a community-managed repository for contributions that conform to well-established API patterns but implement new functionality not available in core TensorFlow, as defined in our Special Interest Group (SIG) charter.

As the years have progressed, new repositories with healthy contributor communities (Keras-CV, Keras-NLP, etc.) have been created with goals similar to ours, and the criteria for contribution acceptance overlap significantly (e.g. the number of required citations). Additionally, since Keras split out of core TensorFlow in 2020, the barrier for community contribution has been substantially lowered.

Understandably, there has been increasing ambiguity regarding where contributions should land and where they will be best maintained. Many features that are available in TFA are simultaneously available in other TensorFlow Community repositories. As just a few examples:

Random Cutout: TFA & Keras-CV
AdamW Optimizer: TFA & Keras
Multihead Attention: TFA & Keras

As part of the original RFC, our Special Interest Group agreed to migrate code from the tf.contrib and keras.contrib repositories. In doing so, TFA inherited C++ custom ops, which made TFA a unique place in the TensorFlow community where C++ custom ops could be contributed, built, and distributed. However, we’ve recently helped migrate much of that infrastructure to Keras-CV so that they can compile and distribute custom ops as they see fit.

What’s Next:

seanpmorgan commented 1 year ago

As with everything else on this community led repository... please share your thoughts, concerns, etc. so we can facilitate a healthy discussion.

AakashKumarNain commented 1 year ago

@seanpmorgan Yeah, this is a hard decision, but we always wanted to reduce those overlaps, and this makes sense. I just want to take a moment to thank you, the other maintainers, and the community members for your efforts! 🍻

foxik commented 1 year ago

First, I want to thank the maintainers of TensorFlow Addons for their work; I have been using TF Addons happily for several years now. Thanks!

I use the following functionality of TF Addons that is currently not in any TF/Keras package, as far as I know:

ayaderaghul commented 1 year ago

Sorry, I am a newbie, but could you please put "InstanceNormalization" into the normal tf layers? Would that be okay?

MarkDaoust commented 1 year ago

@ayaderaghul

https://www.tensorflow.org/api_docs/python/tf/keras/layers/GroupNormalization

Relation to Instance Normalization: If the number of groups is set to the input dimension (number of groups is equal to number of channels), then this operation becomes identical to Instance Normalization.

ayaderaghul commented 1 year ago

@ayaderaghul

https://www.tensorflow.org/api_docs/python/tf/keras/layers/GroupNormalization

Relation to Instance Normalization: If the number of groups is set to the input dimension (number of groups is equal to number of channels), then this operation becomes identical to Instance Normalization.

So, does it mean that in the case of a color image (number of channels = 3), we set the number of groups to 3?

MarkDaoust commented 1 year ago

@ayaderaghul that's what it sounds like to me.
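For a quick sanity check, here is a minimal sketch (assuming TF 2.11+, where `tf.keras.layers.GroupNormalization` is built in) showing that one group per channel normalizes each sample's channels independently over the spatial dimensions, which is the Instance Normalization behavior:

```python
import numpy as np
import tensorflow as tf

# An RGB batch in NHWC layout: 2 images, 8x8 pixels, 3 channels.
x = np.random.rand(2, 8, 8, 3).astype("float32")

# groups == number of channels (3) makes GroupNormalization act like
# InstanceNormalization: each (sample, channel) slice is normalized
# over its own spatial dimensions.
layer = tf.keras.layers.GroupNormalization(groups=3, axis=-1)
y = layer(x).numpy()

# With the default gamma=1, beta=0, every per-sample, per-channel slice
# should have mean ~0 after normalization.
print(np.allclose(y.mean(axis=(1, 2)), 0.0, atol=1e-4))
```

So for an RGB input, `groups=3` reproduces the per-channel, per-sample normalization that `tfa.layers.InstanceNormalization` performs.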

ayaderaghul commented 1 year ago

I have a project using pix2pix from tensorflow_examples, similar to this tutorial: https://www.tensorflow.org/tutorials/generative/cyclegan

In this project, the model is pix2pix, with InstanceNorm

generator_g = pix2pix.unet_generator(OUTPUT_CHANNELS, norm_type='instancenorm')

When I train the model, save it, and then continue training, it returns an error saying something like "No InstanceNormalization". I have to install tensorflow_addons and pass InstanceNormalization as a custom object to load the model.
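One way to make reloading work without installing TFA is to register your own layer through `custom_objects`. The sketch below uses a hypothetical minimal `InstanceNormalization` class of my own (no learnable parameters, not the tutorial's or TFA's implementation) purely to demonstrate the save/reload round trip:

```python
import os
import tempfile

import numpy as np
import tensorflow as tf


class InstanceNormalization(tf.keras.layers.Layer):
    """Minimal instance norm: per-sample, per-channel spatial normalization."""

    def __init__(self, epsilon=1e-5, **kwargs):
        super().__init__(**kwargs)
        self.epsilon = epsilon

    def call(self, x):
        # Moments over the spatial axes only, kept separate per sample/channel.
        mean, var = tf.nn.moments(x, axes=[1, 2], keepdims=True)
        return (x - mean) / tf.sqrt(var + self.epsilon)

    def get_config(self):
        return {**super().get_config(), "epsilon": self.epsilon}


# Save a model that uses the custom layer, then reload it by passing the
# class through custom_objects -- the same mechanism that resolves the
# "No InstanceNormalization" deserialization error.
model = tf.keras.Sequential([tf.keras.Input((8, 8, 3)), InstanceNormalization()])
path = os.path.join(tempfile.mkdtemp(), "instnorm_demo.keras")
model.save(path)
reloaded = tf.keras.models.load_model(
    path, custom_objects={"InstanceNormalization": InstanceNormalization}
)

x = np.random.rand(2, 8, 8, 3).astype("float32")
print(np.allclose(model(x), reloaded(x)))
```

The same `custom_objects` argument works when pointing `load_model` at an existing checkpoint that references a layer class Keras cannot find on its own.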

I have tried the following alternatives but they don't work:

Please help!

Technologicat commented 1 year ago

Is there a replacement for tfa.optimizers.CyclicalLearningRate?

Searching the internet turned up bckenstler/CLR, which is many years old, and its newer fork brianmanderson/Cyclical_Learning_Rate, which is also two years old, but no officially preferred solution.

CLR isn't that complicated, so if it's not part of any actively maintained package, I'll probably just grab a local copy of the class into my project tree.

jazzycap commented 12 months ago

Will the public documentation indicate how to replace each function of TFA with a Keras alternative? For example: the RectifiedAdam and Lookahead optimizers.

sainathadapa commented 10 months ago

Is the MultiOptimizer available elsewhere?

Doniach28 commented 8 months ago

Hi,

Is there any equivalent to dense_image_warp in Keras-CV/Keras?

hansk0812 commented 5 months ago

TFA meant a lot to me. Thank you.