The TensorFlow Cloud repository provides APIs that allow you to easily go from debugging and training your Keras and TensorFlow code in a local environment to distributed training in the cloud.
Kaggle Notebooks supports integrations with several Google Cloud services by authenticating the user via OAuth and providing a credentials object (for example, https://www.kaggle.com/product-feedback/163416).
We'd like to provide support for tensorflow_cloud on Kaggle using the user's existing auth via this mechanism. One way to do that would be to allow a credentials object to be passed to the library on initialization (as other Google Cloud client libraries do).
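For illustration, a hypothetical sketch of what that could look like, modeled on how clients such as google.cloud.storage.Client accept project and credentials; the `credentials` and `project_id` parameters below are assumptions, not part of tensorflow_cloud's current API:

```python
import tensorflow_cloud as tfc

# Hypothetical API shape, mirroring google-cloud client libraries such as
# google.cloud.storage.Client(project=..., credentials=...).
# `user_credentials` would be the OAuth credentials object that Kaggle
# provides after the user authenticates.
tfc.run(
    entry_point="train.py",
    credentials=user_credentials,   # hypothetical parameter
    project_id="my-gcp-project",    # hypothetical parameter
)
```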
Another approach, which I've roughly hacked together below [1], is to take the credentials object and write its contents to the filesystem as the file that GOOGLE_APPLICATION_CREDENTIALS points to. This actually works, and loading such a file is implemented by the google_auth library, but it always returns a None project id (https://github.com/googleapis/google-auth-library-python/blob/9058f1fea680613d9717a62ee37dc294c11b9c8a/google/auth/_default.py#L126), so tensorflow_cloud throws an error: https://github.com/tensorflow/cloud/blob/dd5c18856fcae4948f638392488e94adce90577f/tensorflow_cloud/gcp.py#L27. Would it be possible to support this approach, but provide another mechanism to pass in the project id, for example via a constructor?
Is there any constraint that disallows using user credentials (and requires service accounts to be used)?
[1]
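A minimal sketch of that workaround, assuming the Kaggle-provided object behaves like a standard google.oauth2.credentials.Credentials (the exact attributes available on Kaggle's object are an assumption, and `install_credentials_file` is a hypothetical helper name):

```python
import json
import os
import tempfile

def install_credentials_file(creds):
    """Serialize an authorized-user credentials object to disk and point
    GOOGLE_APPLICATION_CREDENTIALS at it so google.auth.default() can
    pick it up.

    Assumes `creds` exposes client_id / client_secret / refresh_token,
    as google.oauth2.credentials.Credentials does.
    """
    info = {
        "type": "authorized_user",
        "client_id": creds.client_id,
        "client_secret": creds.client_secret,
        "refresh_token": creds.refresh_token,
    }
    # Write the JSON to a temp file and export its path.
    fd, path = tempfile.mkstemp(suffix=".json")
    with os.fdopen(fd, "w") as f:
        json.dump(info, f)
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = path
    return path
```

google.auth.default() will then load this file as authorized-user credentials, but (per the _default.py link above) with a None project id, which is what trips the tensorflow_cloud check.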