google / automl

Google Brain AutoML
Apache License 2.0

All attempts to get a Google authentication bearer token failed, returning an empty token. #1099

Closed johnny12150 closed 3 years ago

johnny12150 commented 3 years ago

I also tried the tfhub.ipynb on my Ubuntu 20.04 machine and still can't load the pretrained layer from TF-Hub through Google Storage. The error is:

2021-10-06 00:57:28.204081: W tensorflow/core/platform/cloud/google_auth_provider.cc:184] All attempts to get a Google authentication bearer token failed, returning an empty token. Retrieving token from files failed with "Not found: Could not locate the credentials file.". Retrieving token from GCE failed with "Failed precondition: Error executing an HTTP request: libcurl code 6 meaning 'Couldn't resolve host name', error details: Couldn't resolve host 'metadata'".
Traceback (most recent call last):
  File "/home/wade/Documents/wade/openml/custom_data.py", line 51, in <module>
    hub.KerasLayer(hub_url, trainable=do_fine_tuning),
  File "/home/wade/anaconda3/envs/tf2/lib/python3.9/site-packages/tensorflow_hub/keras_layer.py", line 153, in __init__
    self._func = load_module(handle, tags, self._load_options)
  File "/home/wade/anaconda3/envs/tf2/lib/python3.9/site-packages/tensorflow_hub/keras_layer.py", line 449, in load_module
    return module_v2.load(handle, tags=tags, options=set_load_options)
  File "/home/wade/anaconda3/envs/tf2/lib/python3.9/site-packages/tensorflow_hub/module_v2.py", line 92, in load
    module_path = resolve(handle)
  File "/home/wade/anaconda3/envs/tf2/lib/python3.9/site-packages/tensorflow_hub/module_v2.py", line 47, in resolve
    return registry.resolver(handle)
  File "/home/wade/anaconda3/envs/tf2/lib/python3.9/site-packages/tensorflow_hub/registry.py", line 51, in __call__
    return impl(*args, **kwargs)
  File "/home/wade/anaconda3/envs/tf2/lib/python3.9/site-packages/tensorflow_hub/resolver.py", line 495, in __call__
    if not tf.compat.v1.gfile.Exists(handle):
  File "/home/wade/anaconda3/envs/tf2/lib/python3.9/site-packages/tensorflow/python/lib/io/file_io.py", line 250, in file_exists
    return file_exists_v2(filename)
  File "/home/wade/anaconda3/envs/tf2/lib/python3.9/site-packages/tensorflow/python/lib/io/file_io.py", line 268, in file_exists_v2
    _pywrap_file_io.FileExists(compat.path_to_bytes(path))
tensorflow.python.framework.errors_impl.AbortedError: All 10 retry attempts failed. The last failure: Unavailable: Error executing an HTTP request: libcurl code 60 meaning 'SSL peer certificate or SSH remote key was not OK', error details: SSL certificate problem: unable to get local issuer certificate
         when reading metadata of gs://cloud-tpu-checkpoints/efficientnet/v2/hub/efficientnetv2-b0/feature-vector

Is there any authentication required on local machines but not on Colab?

fsx950223 commented 3 years ago

https://cloud.google.com/docs/authentication/getting-started
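For a local machine, the guide above boils down to creating a service-account key in the GCP console and pointing the `GOOGLE_APPLICATION_CREDENTIALS` environment variable at it before any `gs://` path is touched. A minimal sketch (the key path is a placeholder, not a real file):

```python
import os

# Placeholder path -- first create and download a service-account key
# from the GCP console (IAM & Admin > Service Accounts).
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account-key.json"

# TensorFlow's GCS filesystem reads this variable, so set it before
# anything like:
# hub.KerasLayer("gs://cloud-tpu-checkpoints/efficientnet/v2/hub/efficientnetv2-b0/feature-vector")
```

Exporting the variable in the shell (`export GOOGLE_APPLICATION_CREDENTIALS=...`) before launching Python works equally well.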

johnny12150 commented 3 years ago

> https://cloud.google.com/docs/authentication/getting-started

Since I hadn't created any authentication key on Colab, I assumed this Google Cloud Storage bucket was public and didn't require an authentication key. Is that right?

fsx950223 commented 3 years ago

No, it is wrong.

fsx950223 commented 3 years ago

Another solution is to set try_gcs=False everywhere.
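In the notebook this means passing `try_gcs=False` to each `tfds.load` call, so tensorflow_datasets downloads and prepares the data locally instead of reading the prebuilt copy from the public GCS bucket. A sketch under that assumption (`load_dataset` is a hypothetical wrapper, not a function from this repo):

```python
def load_dataset(name, split, **kwargs):
    """Hypothetical wrapper mirroring the notebook's tfds.load calls.

    try_gcs=False makes tensorflow_datasets build the dataset locally
    rather than reading from GCS, sidestepping the auth/SSL errors above.
    """
    import tensorflow_datasets as tfds  # assumed installed, as in tfhub.ipynb
    return tfds.load(name, split=split, try_gcs=False, **kwargs)
```

Note this only avoids GCS for the datasets; a `hub_url` that is a `gs://` path would still need credentials (or the equivalent https://tfhub.dev handle).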

johnny12150 commented 3 years ago

> No, it is wrong.

It's a little bit odd that it works on another Linux server without adding an authentication key.

johnny12150 commented 3 years ago

> Another solution is to set try_gcs=False everywhere.

Thanks! This indeed works.