tensorflow / tfx-addons

Developers helping developers. TFX-Addons is a collection of community projects to build new components, examples, libraries, and tools for TFX. The projects are organized under the auspices of the special interest group, SIG TFX-Addons. Join the group at http://goo.gle/tfx-addons-group
Apache License 2.0
125 stars 64 forks

TFX + PyTorch Example #156

Open hanneshapke opened 2 years ago

hanneshapke commented 2 years ago

There are a few TFX examples showing how to train scikit-learn or JAX models, but I haven't seen an example pipeline for PyTorch.

The pipeline could use a well-known dataset, e.g. MNIST, ingest the data via CsvExampleGen, run the standard statistics and schema steps, perform a pseudo-transformation (a passthrough of the values) with the new PandasTransform component from tfx-addons, add a custom run_fn function for PyTorch, and then add a TFMA example.

Any thoughts?

hanneshapke commented 2 years ago

Proposal for the TFX Addons Example: https://github.com/tensorflow/tfx-addons/pull/157

sayakpaul commented 2 years ago

Yes, please. If possible, let's demonstrate it with a model from Hugging Face with a PyTorch backend.

rcrowe-google commented 1 year ago

One of the things that we will need for this is an ONNX extractor for Evaluator. Maybe we should break that out as a separate project?

sayakpaul commented 1 year ago

Could you elaborate this a bit more?

rcrowe-google commented 1 year ago

> One of the things that we will need for this is an ONNX extractor for Evaluator. Maybe we should break that out as a separate project?
>
> Could you elaborate this a bit more?

My understanding is that for PyTorch developers ONNX is a common format for saving trained models, while TF's SavedModel format introduces friction. For non-SavedModel models, Evaluator needs an Extractor in order to generate predictions to measure; see, for example, the one for Sklearn and the one for XGBoost.

sayakpaul commented 1 year ago

ONNX is definitely used, but I am not sure it is the standard one, as you mentioned. This document gives a good rundown of the serialization semantics in PyTorch: https://pytorch.org/docs/stable/notes/serialization.html

ONNX is definitely quite popular there (PyTorch has a built-in ONNX exporter too). What I am gathering here is that we would make ONNX the serialization format for PyTorch models to make them work in a TFX pipeline. Is that so?

rcrowe-google commented 1 year ago

> ... we make ONNX the serialization format for the PyTorch models

My thought is that ONNX would be just one of the serialization formats, which to me suggests that breaking it out as a separate project might make sense. We could also do Extractors for TensorRT, TorchScript, or whatever makes sense (and here I'm displaying my ignorance about what makes sense) and let users choose the one they need.

sayakpaul commented 1 year ago

Got it. Yeah, I concur with your thoughts now.

Moreover, it might make even more sense because users might want to choose an Extractor in accordance with their deployment infra. For example, ONNX might be better for CPU-based deployment, while TensorRT would be better suited for a GPU-based runtime (although ONNX can use TensorRT as a backend as well).

hanneshapke commented 1 year ago

I think Wihan wrote a custom TFMA extractor for PyTorch. We had everything done up to the trainer when we shared the notebook with Wihan. Last time we talked, he was in the process of cleaning up his implementation. He said it worked end-to-end.

rcrowe-google commented 1 year ago

> I think Wihan wrote a custom TFMA extractor for PyTorch. We had everything done up to the trainer when we shared the notebook with Wihan. Last time we talked, he was in the process of cleaning up his implementation. He said it worked end-to-end.

@wihanbooyse - That would be great! It might make sense to refactor the example to break out the extractor separately, and follow that up with some more extractors for other formats.