nogibjj / mlops-template

MIT License

Template for MLOps projects with GPU support

CONDA IS NOT NEEDED AS A PACKAGE MANAGER. All setup is done with the Python Software Foundation's recommended tools, virtualenv and pip, plus the mainstream production tool Docker. See PEP 453, which "officially recommend[s] the use of pip as the default installer for Python packages".

GitHub Codespaces is FREE for education, and so are GPU Codespaces, as of this writing in December 2022.

  1. The first thing to do on launch is to open a new shell and verify that the virtualenv is sourced.
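A quick way to check is to confirm that `python` resolves inside the virtualenv rather than to the system interpreter (the activation path below is an assumption; adjust it to wherever this template creates its venv):

```shell
# Should print a path inside the virtualenv, not /usr/bin/python
which python

# Empty output here means no virtualenv is sourced
echo "$VIRTUAL_ENV"

# If it is not sourced, activate it manually, e.g. (hypothetical path):
# source ~/venv/bin/activate
```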

Things included are:

Two fun tools to explore:

Try out Bento

docker run -it --rm -p 8888:8888 -p 3000:3000 -p 3001:3001 bentoml/quickstart:latest

Verify GPU works

The following examples test out the GPU (including Docker GPU)
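Before running the examples, a host-level sanity check with `nvidia-smi` (which ships with the NVIDIA driver) confirms the GPU is visible at all:

```shell
# List detected GPUs, or report that the driver is missing
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi -L
else
  echo "No NVIDIA driver found on host"
fi
```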

Additionally, this workspace is set up to fine-tune Hugging Face models.

fine-tune

python hugging-face/hf_fine_tune_hello_world.py

Verify containerized GPU works for TensorFlow

Because of potential versioning conflicts between PyTorch and TensorFlow, it is recommended to run TensorFlow via a GPU container and PyTorch via the default environment.
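One way to sketch this, assuming the NVIDIA Container Toolkit is installed, is to run the official TensorFlow GPU image and list the devices it can see:

```shell
# Pull and run TensorFlow's GPU image; an empty list means the container
# cannot see the GPU (check the --gpus flag and the NVIDIA Container Toolkit)
docker run --gpus all --rm tensorflow/tensorflow:latest-gpu \
  python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```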

See TensorFlow GPU documentation

More TensorFlow GPU Ideas

https://www.tensorflow.org/resources/recommendation-systems

# Deploy the retrieval model with TensorFlow Serving
docker run -t --rm -p 8501:8501 \
  -v "RETRIEVAL/MODEL/PATH:/models/retrieval" \
  -e MODEL_NAME=retrieval tensorflow/serving &
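Once the serving container is up, a hedged smoke test is to hit its REST endpoint; the string-user-id payload below follows the TensorFlow recommenders tutorial and may need adapting to the actual retrieval model's signature:

```shell
# Query the retrieval model over TF Serving's REST API (port 8501)
curl -s -X POST http://localhost:8501/v1/models/retrieval:predict \
  -d '{"instances": ["42"]}'
```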

Set up the NVIDIA Container Toolkit for Docker

mlops-tensorflow-gpu

Used in Following Projects

Used as the base and customized in the following Duke MLOps and Applied Data Engineering Coursera Labs: