The official code repository for the second edition of the O'Reilly book Generative Deep Learning: Teaching Machines to Paint, Write, Compose and Play.
This repo provides instructions on how to set up a GCP cloud VM instance with a GPU to run the examples. I would recommend taking it a step further and using GKE Autopilot for GPU workloads instead of VMs.
Some benefits are:
GKE Autopilot's pay-per-use model keeps costs down: applying a workload via kubectl apply is simple, and deleting the pod when it is idle is effortless.
Service-based load balancing can expose Jupyter Lab directly, eliminating the need for port forwarding (see the Service sketch after this list).
Maintenance and upgrades are handled by GKE Autopilot, freeing users from routine system upkeep.
Adopting Kubernetes, a scalable and industry-standard platform, gives readers practical hands-on experience beyond a docker compose on a VM setup.
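To illustrate the load-balancing point above, here is a minimal Service sketch that could expose Jupyter Lab through a cloud load balancer. The name gdl2, the app: gdl2 selector and the port numbers are assumptions and need to match whatever pod manifest you actually deploy.

# Hypothetical sketch: exposes a pod labelled app=gdl2 that serves Jupyter Lab on port 8888.
apiVersion: v1
kind: Service
metadata:
  name: gdl2
spec:
  type: LoadBalancer
  selector:
    app: gdl2
  ports:
  - port: 80          # external port on the load balancer
    targetPort: 8888  # Jupyter Lab port inside the container

Once the external IP is provisioned (kubectl get service gdl2), Jupyter Lab is reachable without any kubectl port-forward.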
This is how I deployed the examples to GKE Autopilot:
Build and push the Docker image:
IMAGE=<your_image> # you can also skip this step and use bulankou/gdl2:20230715 that I built
docker build -f ./docker/Dockerfile.gpu -t $IMAGE .
docker push $IMAGE
Create a GKE Autopilot cluster with all default settings.
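For example (the cluster name and region below are placeholders; any defaults work):

# Create an Autopilot cluster and fetch credentials for kubectl.
gcloud container clusters create-auto gdl-cluster --region us-central1
gcloud container clusters get-credentials gdl-cluster --region us-central1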
Apply the following K8s manifest (kubectl apply -f <yaml>). Make sure to update <IMAGE> below. Also note the cloud.google.com/gke-accelerator: "nvidia-tesla-t4" node selector and the autopilot.gke.io/host-port-assignment annotation, which ensure that we pick the right node type and enable host ports on Autopilot.
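The manifest I applied was along the lines of the sketch below. The pod name, labels, GPU count and Jupyter port are assumptions to adapt to your setup, and the annotation value (the host-port range JSON) is an assumed example; check the GKE docs on dynamic host port assignment for the exact format. It also assumes the image's default command starts Jupyter Lab on port 8888, as the book's Dockerfile does.

apiVersion: v1
kind: Pod
metadata:
  name: gdl2
  labels:
    app: gdl2   # matched by the Service selector shown earlier
  annotations:
    # Enables host ports on Autopilot; the range shown is an assumed example value.
    autopilot.gke.io/host-port-assignment: '{"min": 8000, "max": 9000}'
spec:
  nodeSelector:
    # Tells Autopilot to provision a node with an NVIDIA T4 GPU.
    cloud.google.com/gke-accelerator: "nvidia-tesla-t4"
  containers:
  - name: gdl2
    image: <IMAGE>   # e.g. bulankou/gdl2:20230715
    ports:
    - containerPort: 8888   # Jupyter Lab
    resources:
      limits:
        # Requests one GPU; Autopilot sizes the node accordingly.
        nvidia.com/gpu: 1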