DiogenesAnalytics / autoencoder

Python library implementing various autoencoders.
https://colab.research.google.com/github/DiogenesAnalytics/autoencoder/blob/master/notebooks/demo/anomaly_detection.ipynb
MIT License

Docker: Add GPU Support for Tensorflow #12

Open DiogenesAnalytics opened 9 months ago

DiogenesAnalytics commented 9 months ago

Problem

Currently, the Docker image being built does not include the GPU drivers and CUDA libraries required by the version of tensorflow pinned in the pyproject.toml file: https://github.com/DiogenesAnalytics/autoencoder/blob/ca2c35b185c0c106b282d6bea6077aa7ba04c3cf/pyproject.toml#L20

Solution

What seems to work is installing tensorflow[and-cuda] via pip, as described in the TensorFlow install documentation. The question now becomes: how should this optional install be handled in the autoencoder library (which uses Poetry as its build backend)?
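For reference, the install command from the TensorFlow documentation that seems to do the trick looks like this (the verification line is only an illustrative check, not something taken from this repository):

```bash
# Pull in TensorFlow together with the CUDA libraries it needs as pip wheels
# (quotes keep the extras marker from being interpreted by the shell).
pip install 'tensorflow[and-cuda]'

# Optional sanity check that TensorFlow can actually see the GPU
python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```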

References

DiogenesAnalytics commented 9 months ago

Implemented (finally) in commit 1c7b1f6a158eec754b7652d9d949bef77f91b700.

DiogenesAnalytics commented 9 months ago

Problem

Installing the and-cuda extras (i.e. as tensorflow[and-cuda]) makes the resulting Docker image MASSIVE (screenshot attached).
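For anyone reproducing this, the resulting image size can be checked with something like the following (the image name is a placeholder, not necessarily what the build tags it as):

```bash
# List local images with their sizes; "autoencoder" is a placeholder tag
docker images autoencoder --format "{{.Repository}}:{{.Tag}}  {{.Size}}"
```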

Solution

Need to work through the Poetry docs to figure out the correct way to make these CUDA libs optional ...
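One possible direction, purely as a sketch and not what the repository actually does: declare the CUDA-enabled TensorFlow as an optional dependency and expose it through a Poetry extra. The gpu extra name and the version pin below are placeholders.

```toml
# Hypothetical sketch only -- not this project's actual pyproject.toml.
# The CUDA wheels are opted into via a "gpu" extra instead of the base install.
[tool.poetry.dependencies]
tensorflow = { version = "*", extras = ["and-cuda"], optional = true }

[tool.poetry.extras]
gpu = ["tensorflow"]
```

With something along these lines, `poetry install --extras gpu` (or `pip install 'autoencoder[gpu]'`) would pull in the CUDA wheels while a default install skips them; how to keep a plain CPU tensorflow in the base install at the same time is exactly the open question here.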

DiogenesAnalytics commented 9 months ago

References

Some references specific to this issue.

- Poetry
- TensorFlow
- NVIDIA
- PIP