aws / sagemaker-tensorflow-serving-container

A TensorFlow Serving solution for use in SageMaker. This repo is now deprecated.
Apache License 2.0

sagemaker notebook instance Elastic Inference tensorflow model local deployment #142

Open pankajxyz opened 4 years ago

pankajxyz commented 4 years ago

I am trying to replicate https://github.com/awslabs/amazon-sagemaker-examples/blob/master/sagemaker-python-sdk/tensorflow_serving_using_elastic_inference_with_your_own_model/tensorflow_serving_pretrained_model_elastic_inference.ipynb

My Elastic Inference accelerator is attached to the notebook instance, and I am using the conda_amazonei_tensorflow_p36 kernel. Following the documentation, I made the changes for local EI:

%%time
import boto3

region = boto3.Session().region_name
saved_model = 's3://sagemaker-sample-data-{}/tensorflow/model/resnet/resnet_50_v2_fp32_NCHW.tar.gz'.format(region)

import sagemaker
from sagemaker.tensorflow.serving import Model

role = sagemaker.get_execution_role()

tensorflow_model = Model(model_data=saved_model,
                         role=role,
                         framework_version='1.14')
tf_predictor = tensorflow_model.deploy(initial_instance_count=1,
                                       instance_type='local',
                                       accelerator_type='local_sagemaker_notebook')

I am getting the following log in the notebook:

Attaching to tmp6uqys1el_algo-1-7ynb1_1
algo-1-7ynb1_1  | INFO:__main__:starting services
algo-1-7ynb1_1  | INFO:__main__:using default model name: Servo
algo-1-7ynb1_1  | INFO:__main__:tensorflow serving model config: 
algo-1-7ynb1_1  | model_config_list: {
algo-1-7ynb1_1  |   config: {
algo-1-7ynb1_1  |     name: "Servo",
algo-1-7ynb1_1  |     base_path: "/opt/ml/model/export/Servo",
algo-1-7ynb1_1  |     model_platform: "tensorflow"
algo-1-7ynb1_1  |   }
algo-1-7ynb1_1  | }
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  | INFO:__main__:nginx config: 
algo-1-7ynb1_1  | load_module modules/ngx_http_js_module.so;
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  | worker_processes auto;
algo-1-7ynb1_1  | daemon off;
algo-1-7ynb1_1  | pid /tmp/nginx.pid;
algo-1-7ynb1_1  | error_log  /dev/stderr error;
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  | worker_rlimit_nofile 4096;
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  | events {
algo-1-7ynb1_1  |   worker_connections 2048;
algo-1-7ynb1_1  | }
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  | http {
algo-1-7ynb1_1  |   include /etc/nginx/mime.types;
algo-1-7ynb1_1  |   default_type application/json;
algo-1-7ynb1_1  |   access_log /dev/stdout combined;
algo-1-7ynb1_1  |   js_include tensorflow-serving.js;
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  |   upstream tfs_upstream {
algo-1-7ynb1_1  |     server localhost:8501;
algo-1-7ynb1_1  |   }
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  |   upstream gunicorn_upstream {
algo-1-7ynb1_1  |     server unix:/tmp/gunicorn.sock fail_timeout=1;
algo-1-7ynb1_1  |   }
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  |   server {
algo-1-7ynb1_1  |     listen 8080 deferred;
algo-1-7ynb1_1  |     client_max_body_size 0;
algo-1-7ynb1_1  |     client_body_buffer_size 100m;
algo-1-7ynb1_1  |     subrequest_output_buffer_size 100m;
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  |     set $tfs_version 1.14;
algo-1-7ynb1_1  |     set $default_tfs_model Servo;
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  |     location /tfs {
algo-1-7ynb1_1  |         rewrite ^/tfs/(.*) /$1  break;
algo-1-7ynb1_1  |         proxy_redirect off;
algo-1-7ynb1_1  |         proxy_pass_request_headers off;
algo-1-7ynb1_1  |         proxy_set_header Content-Type 'application/json';
algo-1-7ynb1_1  |         proxy_set_header Accept 'application/json';
algo-1-7ynb1_1  |         proxy_pass http://tfs_upstream;
algo-1-7ynb1_1  |     }
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  |     location /ping {
algo-1-7ynb1_1  |         js_content ping;
algo-1-7ynb1_1  |     }
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  |     location /invocations {
algo-1-7ynb1_1  |         js_content invocations;
algo-1-7ynb1_1  |     }
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  |     location ~ ^/models/(.*)/invoke {
algo-1-7ynb1_1  |         js_content invocations;
algo-1-7ynb1_1  |     }
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  |     location /models {
algo-1-7ynb1_1  |         proxy_pass http://gunicorn_upstream/models;
algo-1-7ynb1_1  |     }
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  |     location / {
algo-1-7ynb1_1  |         return 404 '{"error": "Not Found"}';
algo-1-7ynb1_1  |     }
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  |     keepalive_timeout 3;
algo-1-7ynb1_1  |   }
algo-1-7ynb1_1  | }
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  | 
algo-1-7ynb1_1  | INFO:__main__:tensorflow version info:
algo-1-7ynb1_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-7ynb1_1  | TensorFlow Library: 1.14.0
algo-1-7ynb1_1  | EI Version: EI-1.4
algo-1-7ynb1_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-7ynb1_1  | INFO:__main__:started tensorflow serving (pid: 8)
algo-1-7ynb1_1  | INFO:__main__:nginx version info:
algo-1-7ynb1_1  | nginx version: nginx/1.16.1
algo-1-7ynb1_1  | built by gcc 5.4.0 20160609 (Ubuntu 5.4.0-6ubuntu1~16.04.11) 
algo-1-7ynb1_1  | built with OpenSSL 1.0.2g  1 Mar 2016
algo-1-7ynb1_1  | TLS SNI support enabled
algo-1-7ynb1_1  | configure arguments: --prefix=/etc/nginx --sbin-path=/usr/sbin/nginx --modules-path=/usr/lib/nginx/modules --conf-path=/etc/nginx/nginx.conf --error-log-path=/var/log/nginx/error.log --http-log-path=/var/log/nginx/access.log --pid-path=/var/run/nginx.pid --lock-path=/var/run/nginx.lock --http-client-body-temp-path=/var/cache/nginx/client_temp --http-proxy-temp-path=/var/cache/nginx/proxy_temp --http-fastcgi-temp-path=/var/cache/nginx/fastcgi_temp --http-uwsgi-temp-path=/var/cache/nginx/uwsgi_temp --http-scgi-temp-path=/var/cache/nginx/scgi_temp --user=nginx --group=nginx --with-compat --with-file-aio --with-threads --with-http_addition_module --with-http_auth_request_module --with-http_dav_module --with-http_flv_module --with-http_gunzip_module --with-http_gzip_static_module --with-http_mp4_module --with-http_random_index_module --with-http_realip_module --with-http_secure_link_module --with-http_slice_module --with-http_ssl_module --with-http_stub_status_module --with-http_sub_module --with-http_v2_module --with-mail --with-mail_ssl_module --with-stream --with-stream_realip_module --with-stream_ssl_module --with-stream_ssl_preread_module --with-cc-opt='-g -O2 -fPIE -fstack-protector-strong -Wformat -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -fPIC' --with-ld-opt='-Wl,-Bsymbolic-functions -fPIE -pie -Wl,-z,relro -Wl,-z,now -Wl,--as-needed -pie'
algo-1-7ynb1_1  | INFO:__main__:started nginx (pid: 10)
algo-1-7ynb1_1  | 2020-06-17 05:02:08.888114: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-7ynb1_1  | 2020-06-17 05:02:08.888186: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-7ynb1_1  | 2020-06-17 05:02:08.988623: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-7ynb1_1  | 2020-06-17 05:02:08.988688: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-7ynb1_1  | 2020-06-17 05:02:08.988728: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-7ynb1_1  | 2020-06-17 05:02:08.988762: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-7ynb1_1  | 2020-06-17 05:02:08.988783: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-7ynb1_1  | 2020-06-17 05:02:09.001922: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-7ynb1_1  | 2020-06-17 05:02:09.082734: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-7ynb1_1  | 2020-06-17 05:02:09.613725: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-7ynb1_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-7ynb1_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-7ynb1_1  | Elastic Inference Accelerator ID: eia-813285f77ceb448c849e2331116f251b
algo-1-7ynb1_1  | Elastic Inference Accelerator Type: eia2.medium
algo-1-7ynb1_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-7ynb1_1  | 
!algo-1-7ynb1_1  | 172.18.0.1 - - [17/Jun/2020:05:02:10 +0000] "GET /ping HTTP/1.1" 200 3 "-" "-"
algo-1-7ynb1_1  | [Wed Jun 17 05:02:11 2020, 662569us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-7ynb1_1  | [Wed Jun 17 05:02:11 2020, 662722us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-7ynb1_1  |     EI Error Code: [3, 16, 8]
algo-1-7ynb1_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-7ynb1_1  |     EI Request ID: TF-D66B9810-D81A-448F-ACE2-703FFFA0F194  --  EI Accelerator ID: eia-813285f77ceb448c849e2331116f251b
algo-1-7ynb1_1  |     EI Client Version: 1.5.3
algo-1-7ynb1_1  | 2020-06-17 05:02:11.668412: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-7ynb1_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-7ynb1_1  | INFO:__main__:tensorflow version info:
algo-1-7ynb1_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-7ynb1_1  | TensorFlow Library: 1.14.0
algo-1-7ynb1_1  | EI Version: EI-1.4
algo-1-7ynb1_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-7ynb1_1  | INFO:__main__:started tensorflow serving (pid: 38)
algo-1-7ynb1_1  | 2020-06-17 05:02:11.759706: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-7ynb1_1  | 2020-06-17 05:02:11.759783: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-7ynb1_1  | 2020-06-17 05:02:11.860242: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-7ynb1_1  | 2020-06-17 05:02:11.860309: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-7ynb1_1  | 2020-06-17 05:02:11.860333: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-7ynb1_1  | 2020-06-17 05:02:11.860365: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-7ynb1_1  | 2020-06-17 05:02:11.860382: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-7ynb1_1  | 2020-06-17 05:02:11.873381: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-7ynb1_1  | 2020-06-17 05:02:11.949421: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-7ynb1_1  | 2020-06-17 05:02:12.512935: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-7ynb1_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-7ynb1_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-7ynb1_1  | Elastic Inference Accelerator ID: eia-813285f77ceb448c849e2331116f251b
algo-1-7ynb1_1  | Elastic Inference Accelerator Type: eia2.medium
algo-1-7ynb1_1  | Elastic Inference Accelerator Ordinal: 0

The log never stops in the notebook; it keeps printing into the notebook cells. I am not sure whether the model is deployed correctly.

I can see the Docker container for the model running (screenshot attached).

When I try to run inference against that model, I get the following error:

algo-1-iikpj_1  | [Wed Jun 17 05:29:47 2020, 761607us] [Execution Engine] Error getting application context for [TensorFlow][2]

algo-1-iikpj_1  | [Wed Jun 17 05:29:47 2020, 761691us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-iikpj_1  |     EI Error Code: [3, 16, 8]
algo-1-iikpj_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-iikpj_1  |     EI Request ID: TF-ADECD8EF-7138-4B5F-9C37-ADFDC8122DF1  --  EI Accelerator ID: eia-813285f77ceb448c849e2331116f251b
algo-1-iikpj_1  |     EI Client Version: 1.5.3
algo-1-iikpj_1  | 2020-06-17 05:29:47.768249: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-iikpj_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-iikpj_1  | INFO:__main__:tensorflow version info:
algo-1-iikpj_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-iikpj_1  | TensorFlow Library: 1.14.0
algo-1-iikpj_1  | EI Version: EI-1.4
algo-1-iikpj_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-iikpj_1  | INFO:__main__:started tensorflow serving (pid: 1052)
algo-1-iikpj_1  | 2020-06-17 05:29:47.854331: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-iikpj_1  | 2020-06-17 05:29:47.854405: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-iikpj_1  | 2020/06/17 05:29:47 [error] 11#11: *2 connect() failed (111: Connection refused) while connecting to upstream, client: 172.18.0.1, server: , request: "POST /invocations HTTP/1.1", subrequest: "/v1/models/Servo:predict", upstream: "http://127.0.0.1:8501/v1/models/Servo:predict", host: "localhost:8080"
algo-1-iikpj_1  | 2020/06/17 05:29:47 [error] 11#11: *2 connect() failed (111: Connection refused) while connecting to upstream, client: 172.18.0.1, server: , request: "POST /invocations HTTP/1.1", subrequest: "/v1/models/Servo:predict", upstream: "http://127.0.0.1:8501/v1/models/Servo:predict", host: "localhost:8080"
algo-1-iikpj_1  | 172.18.0.1 - - [17/Jun/2020:05:29:47 +0000] "POST /invocations HTTP/1.1" 502 157 "-" "-"
algo-1-iikpj_1  | 2020-06-17 05:29:47.954825: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-iikpj_1  | 2020-06-17 05:29:47.954887: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-iikpj_1  | 2020-06-17 05:29:47.955448: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-iikpj_1  | 2020-06-17 05:29:47.955494: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-iikpj_1  | 2020-06-17 05:29:47.955859: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-iikpj_1  | 2020-06-17 05:29:47.969511: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
---------------------------------------------------------------------------
JSONDecodeError                           Traceback (most recent call last)
<timed exec> in <module>()

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/site-packages/sagemaker/tensorflow/serving.py in predict(self, data, initial_args)
    116                 args["CustomAttributes"] = self._model_attributes
    117 
--> 118         return super(Predictor, self).predict(data, args)
    119 
    120 

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/site-packages/sagemaker/predictor.py in predict(self, data, initial_args, target_model)
    109         request_args = self._create_request_args(data, initial_args, target_model)
    110         response = self.sagemaker_session.sagemaker_runtime_client.invoke_endpoint(**request_args)
--> 111         return self._handle_response(response)
    112 
    113     def _handle_response(self, response):

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/site-packages/sagemaker/predictor.py in _handle_response(self, response)
    119         if self.deserializer is not None:
    120             # It's the deserializer's responsibility to close the stream
--> 121             return self.deserializer(response_body, response["ContentType"])
    122         data = response_body.read()
    123         response_body.close()

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/site-packages/sagemaker/predictor.py in __call__(self, stream, content_type)
    578         """
    579         try:
--> 580             return json.load(codecs.getreader("utf-8")(stream))
    581         finally:
    582             stream.close()

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/json/__init__.py in load(fp, cls, object_hook, parse_float, parse_int, parse_constant, object_pairs_hook, **kw)
    297         cls=cls, object_hook=object_hook,
    298         parse_float=parse_float, parse_int=parse_int,
--> 299         parse_constant=parse_constant, object_pairs_hook=object_pairs_hook, **kw)
    300 
    301 

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/json/__init__.py in loads(s, encoding, cls, object_hook, parse_float, parse_int, parse_constant, object_pairs_hook, **kw)
    352             parse_int is None and parse_float is None and
    353             parse_constant is None and object_pairs_hook is None and not kw):
--> 354         return _default_decoder.decode(s)
    355     if cls is None:
    356         cls = JSONDecoder

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/json/decoder.py in decode(self, s, _w)
    337 
    338         """
--> 339         obj, end = self.raw_decode(s, idx=_w(s, 0).end())
    340         end = _w(s, end).end()
    341         if end != len(s):

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/json/decoder.py in raw_decode(self, s, idx)
    355             obj, end = self.scan_once(s, idx)
    356         except StopIteration as err:
--> 357             raise JSONDecodeError("Expecting value", s, err.value) from None
    358         return obj, end

JSONDecodeError: Expecting value: line 1 column 1 (char 0)

algo-1-iikpj_1  | 2020-06-17 05:29:48.047106: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-iikpj_1  | 2020-06-17 05:29:48.564452: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-iikpj_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3

I tried several ways to work around `JSONDecodeError: Expecting value: line 1 column 1 (char 0)` using `json.loads`, `json.dumps`, etc., but nothing helps. I also tried a REST API POST to the Docker-deployed model:

curl -v -X POST \
    -H 'content-type:application/json' \
    -d '{"data": {"inputs": [[[[0.13075708159043742, 0.048010725848070535, 0.9012465727287071], [0.1643217202482622, 0.7392467524276859, 0.5618572640643519], [0.7697097217983989, 0.9829998452540657, 0.08567413146192027]]]]} }' \
    http://127.0.0.1:8080/v1/models/Servo:predict

but I still get an error (screenshot attached).
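As a side note on the curl request above: the TensorFlow Serving REST predict API expects the request body keyed as `instances` (or `inputs`) at the top level, rather than wrapped in a `data` object. A minimal sketch of building such a payload (the URL, port 8080, and model name Servo are assumptions taken from the logs above, not verified here):

```python
import json

# Build a predict payload in the shape TensorFlow Serving's REST API expects:
# a top-level "instances" list with one entry per input example,
# not a {"data": {...}} wrapper.
batch = [[[[0.13, 0.05, 0.90],
           [0.16, 0.74, 0.56],
           [0.77, 0.98, 0.09]]]]  # toy nested-list input, not a real ResNet tensor
payload = json.dumps({"instances": batch})

# The request itself would then go to the container's HTTP port, e.g.:
# requests.post("http://127.0.0.1:8080/v1/models/Servo:predict",
#               data=payload, headers={"Content-Type": "application/json"})
parsed = json.loads(payload)
print(list(parsed.keys()))  # -> ['instances']
```

Note this only addresses the request shape; it would not fix the EI authentication error in the logs.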

Please help me resolve this issue. Initially I was trying to use my own TensorFlow Serving model and got the same errors, so I switched to the same model used in your example notebook (resnet_50_v2_fp32_NCHW.tar.gz). The experiment above therefore uses your example notebook with the model provided by sagemaker-sample-data.

Please help me out. Thanks

laurenyu commented 4 years ago

I created a ml.c5.xlarge + ml.eia1.medium Notebook Instance, opened the example notebook you linked to, and chose the conda_amazonei_tensorflow_p36 kernel.

I then changed the fourth code cell to:

predictor = tensorflow_model.deploy(initial_instance_count=1,
                                    instance_type='local',
                                    accelerator_type='local_sagemaker_notebook')

and the notebook ran successfully for me.

Do you have any more info that could help with reproducing the issue?

pankajxyz commented 4 years ago

I was using ml.t2.medium + ml.eia2.medium with the conda_amazonei_tensorflow_p36 kernel, but that should not make any difference. I have now also tried ml.c5.xlarge + ml.eia1.medium, this time opening JupyterLab instead of Jupyter Notebook. The model.deploy() error logs were exactly the same; in fact, so many errors kept being thrown that it became hard to scroll in the notebook.

It's very strange that it worked for you so smoothly. Can you share your notebook with output cells? Thanks. (screenshots attached)

laurenyu commented 4 years ago

Can you share your notebook with output cells?

(screenshots of the notebook cells attached) Logs from that cell:

Attaching to tmpfj17qpp0_algo-1-hvkph_1
algo-1-hvkph_1  | INFO:__main__:starting services
algo-1-hvkph_1  | INFO:__main__:using default model name: Servo
algo-1-hvkph_1  | INFO:__main__:tensorflow serving model config: 
algo-1-hvkph_1  | model_config_list: {
algo-1-hvkph_1  |   config: {
algo-1-hvkph_1  |     name: "Servo",
algo-1-hvkph_1  |     base_path: "/opt/ml/model/export/Servo",
algo-1-hvkph_1  |     model_platform: "tensorflow"
algo-1-hvkph_1  |   }
algo-1-hvkph_1  | }
algo-1-hvkph_1  | 
algo-1-hvkph_1  | 
algo-1-hvkph_1  | INFO:__main__:nginx config: 
algo-1-hvkph_1  | load_module modules/ngx_http_js_module.so;
algo-1-hvkph_1  | 
algo-1-hvkph_1  | worker_processes auto;
algo-1-hvkph_1  | daemon off;
algo-1-hvkph_1  | pid /tmp/nginx.pid;
algo-1-hvkph_1  | error_log  /dev/stderr error;
algo-1-hvkph_1  | 
algo-1-hvkph_1  | worker_rlimit_nofile 4096;
algo-1-hvkph_1  | 
algo-1-hvkph_1  | events {
algo-1-hvkph_1  |   worker_connections 2048;
algo-1-hvkph_1  | }
algo-1-hvkph_1  | 
algo-1-hvkph_1  | http {
algo-1-hvkph_1  |   include /etc/nginx/mime.types;
algo-1-hvkph_1  |   default_type application/json;
algo-1-hvkph_1  |   access_log /dev/stdout combined;
algo-1-hvkph_1  |   js_include tensorflow-serving.js;
algo-1-hvkph_1  | 
algo-1-hvkph_1  |   upstream tfs_upstream {
algo-1-hvkph_1  |     server localhost:8501;
algo-1-hvkph_1  |   }
algo-1-hvkph_1  | 
algo-1-hvkph_1  |   upstream gunicorn_upstream {
algo-1-hvkph_1  |     server unix:/tmp/gunicorn.sock fail_timeout=1;
algo-1-hvkph_1  |   }
algo-1-hvkph_1  | 
algo-1-hvkph_1  |   server {
algo-1-hvkph_1  |     listen 8080 deferred;
algo-1-hvkph_1  |     client_max_body_size 0;
algo-1-hvkph_1  |     client_body_buffer_size 100m;
algo-1-hvkph_1  |     subrequest_output_buffer_size 100m;
algo-1-hvkph_1  | 
algo-1-hvkph_1  |     set $tfs_version 1.14;
algo-1-hvkph_1  |     set $default_tfs_model Servo;
algo-1-hvkph_1  | 
algo-1-hvkph_1  |     location /tfs {
algo-1-hvkph_1  |         rewrite ^/tfs/(.*) /$1  break;
algo-1-hvkph_1  |         proxy_redirect off;
algo-1-hvkph_1  |         proxy_pass_request_headers off;
algo-1-hvkph_1  |         proxy_set_header Content-Type 'application/json';
algo-1-hvkph_1  |         proxy_set_header Accept 'application/json';
algo-1-hvkph_1  |         proxy_pass http://tfs_upstream;
algo-1-hvkph_1  |     }
algo-1-hvkph_1  | 
algo-1-hvkph_1  |     location /ping {
algo-1-hvkph_1  |         js_content ping;
algo-1-hvkph_1  |     }
algo-1-hvkph_1  | 
algo-1-hvkph_1  |     location /invocations {
algo-1-hvkph_1  |         js_content invocations;
algo-1-hvkph_1  |     }
algo-1-hvkph_1  | 
algo-1-hvkph_1  |     location ~ ^/models/(.*)/invoke {
algo-1-hvkph_1  |         js_content invocations;
algo-1-hvkph_1  |     }
algo-1-hvkph_1  | 
algo-1-hvkph_1  |     location /models {
algo-1-hvkph_1  |         proxy_pass http://gunicorn_upstream/models;
algo-1-hvkph_1  |     }
algo-1-hvkph_1  | 
algo-1-hvkph_1  |     location / {
algo-1-hvkph_1  |         return 404 '{"error": "Not Found"}';
algo-1-hvkph_1  |     }
algo-1-hvkph_1  | 
algo-1-hvkph_1  |     keepalive_timeout 3;
algo-1-hvkph_1  |   }
algo-1-hvkph_1  | }
algo-1-hvkph_1  | 
algo-1-hvkph_1  | 
algo-1-hvkph_1  | INFO:__main__:tensorflow version info:
algo-1-hvkph_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-hvkph_1  | TensorFlow Library: 1.14.0
algo-1-hvkph_1  | EI Version: EI-1.4
algo-1-hvkph_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-hvkph_1  | INFO:__main__:started tensorflow serving (pid: 8)
algo-1-hvkph_1  | INFO:__main__:nginx version info:
algo-1-hvkph_1  | nginx version: nginx/1.16.1
algo-1-hvkph_1  | built by gcc 5.4.0 20160609 (Ubuntu 5.4.0-6ubuntu1~16.04.11) 
algo-1-hvkph_1  | built with OpenSSL 1.0.2g  1 Mar 2016
algo-1-hvkph_1  | TLS SNI support enabled
algo-1-hvkph_1  | configure arguments: --prefix=/etc/nginx --sbin-path=/usr/sbin/nginx --modules-path=/usr/lib/nginx/modules --conf-path=/etc/nginx/nginx.conf --error-log-path=/var/log/nginx/error.log --http-log-path=/var/log/nginx/access.log --pid-path=/var/run/nginx.pid --lock-path=/var/run/nginx.lock --http-client-body-temp-path=/var/cache/nginx/client_temp --http-proxy-temp-path=/var/cache/nginx/proxy_temp --http-fastcgi-temp-path=/var/cache/nginx/fastcgi_temp --http-uwsgi-temp-path=/var/cache/nginx/uwsgi_temp --http-scgi-temp-path=/var/cache/nginx/scgi_temp --user=nginx --group=nginx --with-compat --with-file-aio --with-threads --with-http_addition_module --with-http_auth_request_module --with-http_dav_module --with-http_flv_module --with-http_gunzip_module --with-http_gzip_static_module --with-http_mp4_module --with-http_random_index_module --with-http_realip_module --with-http_secure_link_module --with-http_slice_module --with-http_ssl_module --with-http_stub_status_module --with-http_sub_module --with-http_v2_module --with-mail --with-mail_ssl_module --with-stream --with-stream_realip_module --with-stream_ssl_module --with-stream_ssl_preread_module --with-cc-opt='-g -O2 -fPIE -fstack-protector-strong -Wformat -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -fPIC' --with-ld-opt='-Wl,-Bsymbolic-functions -fPIE -pie -Wl,-z,relro -Wl,-z,now -Wl,--as-needed -pie'
algo-1-hvkph_1  | INFO:__main__:started nginx (pid: 10)
algo-1-hvkph_1  | 2020-06-17 20:52:03.345429: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-hvkph_1  | 2020-06-17 20:52:03.345474: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-hvkph_1  | 2020-06-17 20:52:03.445748: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-hvkph_1  | 2020-06-17 20:52:03.445788: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-hvkph_1  | 2020-06-17 20:52:03.445803: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-hvkph_1  | 2020-06-17 20:52:03.445834: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-hvkph_1  | 2020-06-17 20:52:03.445848: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-hvkph_1  | 2020-06-17 20:52:03.455352: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-hvkph_1  | 2020-06-17 20:52:03.466221: I external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX512F
algo-1-hvkph_1  | 2020-06-17 20:52:03.510498: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-hvkph_1  | 2020-06-17 20:52:03.893187: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-hvkph_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-hvkph_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-hvkph_1  | Elastic Inference Accelerator ID: eia-b24ac366a9d9424bb2dbf23baaf3a786
algo-1-hvkph_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-hvkph_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-hvkph_1  | 
algo-1-hvkph_1  | 2020-06-17 20:52:04.032571: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:311] SavedModel load for tags { serve }; Status: success. Took 586714 microseconds.
algo-1-hvkph_1  | 2020-06-17 20:52:04.032647: I tensorflow_serving/servables/tensorflow/saved_model_warmup.cc:103] No warmup data file found at /opt/ml/model/export/Servo/1527887769/assets.extra/tf_serving_warmup_requests
algo-1-hvkph_1  | 2020-06-17 20:52:04.032819: I tensorflow_serving/core/loader_harness.cc:86] Successfully loaded servable version {name: Servo version: 1527887769}
algo-1-hvkph_1  | 2020-06-17 20:52:04.034364: I tensorflow_serving/model_servers/server.cc:328] Running gRPC ModelServer at 0.0.0.0:9000 ...
algo-1-hvkph_1  | [warn] getaddrinfo: address family for nodename not supported
algo-1-hvkph_1  | 2020-06-17 20:52:04.035147: I tensorflow_serving/model_servers/server.cc:348] Exporting HTTP/REST API at:localhost:8501 ...
algo-1-hvkph_1  | [evhttp_server.cc : 239] RAW: Entering the event loop ...

!algo-1-hvkph_1  | 172.18.0.1 - - [17/Jun/2020:20:52:05 +0000] "GET /ping HTTP/1.1" 200 3 "-" "-"
CPU times: user 1.19 s, sys: 343 ms, total: 1.53 s
Wall time: 6.28 s
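After a successful deploy, the example notebook invokes the predictor with a random batch. A minimal sketch of building that input (the 1x3x224x224 NCHW shape is assumed from the resnet_50_v2_fp32_NCHW model name, and `predictor` refers to the object returned by the deploy cell above):

```python
import numpy as np

# Random batch in NCHW layout for the assumed ResNet-50 input:
# 1 image, 3 channels, 224x224 pixels.
random_input = np.random.rand(1, 3, 224, 224)

# With the predictor from the deploy cell above, the call would look like:
# prediction = predictor.predict(random_input.tolist())
print(random_input.shape)  # -> (1, 3, 224, 224)
```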


pankajxyz commented 4 years ago

(screenshots attached)

Attaching to tmpw0h9l0hn_algo-1-4gcx7_1
algo-1-4gcx7_1  | INFO:__main__:starting services
algo-1-4gcx7_1  | INFO:__main__:using default model name: Servo
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving model config: 
algo-1-4gcx7_1  | model_config_list: {
algo-1-4gcx7_1  |   config: {
algo-1-4gcx7_1  |     name: "Servo",
algo-1-4gcx7_1  |     base_path: "/opt/ml/model/export/Servo",
algo-1-4gcx7_1  |     model_platform: "tensorflow"
algo-1-4gcx7_1  |   }
algo-1-4gcx7_1  | }
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | INFO:__main__:nginx config: 
algo-1-4gcx7_1  | load_module modules/ngx_http_js_module.so;
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | worker_processes auto;
algo-1-4gcx7_1  | daemon off;
algo-1-4gcx7_1  | pid /tmp/nginx.pid;
algo-1-4gcx7_1  | error_log  /dev/stderr error;
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | worker_rlimit_nofile 4096;
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | events {
algo-1-4gcx7_1  |   worker_connections 2048;
algo-1-4gcx7_1  | }
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | http {
algo-1-4gcx7_1  |   include /etc/nginx/mime.types;
algo-1-4gcx7_1  |   default_type application/json;
algo-1-4gcx7_1  |   access_log /dev/stdout combined;
algo-1-4gcx7_1  |   js_include tensorflow-serving.js;
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  |   upstream tfs_upstream {
algo-1-4gcx7_1  |     server localhost:8501;
algo-1-4gcx7_1  |   }
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  |   upstream gunicorn_upstream {
algo-1-4gcx7_1  |     server unix:/tmp/gunicorn.sock fail_timeout=1;
algo-1-4gcx7_1  |   }
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  |   server {
algo-1-4gcx7_1  |     listen 8080 deferred;
algo-1-4gcx7_1  |     client_max_body_size 0;
algo-1-4gcx7_1  |     client_body_buffer_size 100m;
algo-1-4gcx7_1  |     subrequest_output_buffer_size 100m;
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  |     set $tfs_version 1.14;
algo-1-4gcx7_1  |     set $default_tfs_model Servo;
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  |     location /tfs {
algo-1-4gcx7_1  |         rewrite ^/tfs/(.*) /$1  break;
algo-1-4gcx7_1  |         proxy_redirect off;
algo-1-4gcx7_1  |         proxy_pass_request_headers off;
algo-1-4gcx7_1  |         proxy_set_header Content-Type 'application/json';
algo-1-4gcx7_1  |         proxy_set_header Accept 'application/json';
algo-1-4gcx7_1  |         proxy_pass http://tfs_upstream;
algo-1-4gcx7_1  |     }
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  |     location /ping {
algo-1-4gcx7_1  |         js_content ping;
algo-1-4gcx7_1  |     }
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  |     location /invocations {
algo-1-4gcx7_1  |         js_content invocations;
algo-1-4gcx7_1  |     }
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  |     location ~ ^/models/(.*)/invoke {
algo-1-4gcx7_1  |         js_content invocations;
algo-1-4gcx7_1  |     }
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  |     location /models {
algo-1-4gcx7_1  |         proxy_pass http://gunicorn_upstream/models;
algo-1-4gcx7_1  |     }
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  |     location / {
algo-1-4gcx7_1  |         return 404 '{"error": "Not Found"}';
algo-1-4gcx7_1  |     }
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  |     keepalive_timeout 3;
algo-1-4gcx7_1  |   }
algo-1-4gcx7_1  | }
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | 
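As an aside on the nginx config above: the `location /tfs` block strips the `/tfs` prefix before proxying to TensorFlow Serving's REST API on port 8501. A minimal sketch of that rewrite rule in Python (the sample URI is illustrative, not from the logs):

```python
import re

# Mirror of nginx's `rewrite ^/tfs/(.*) /$1 break;`: drop the /tfs prefix
# so the remaining path is forwarded to TensorFlow Serving on :8501.
def strip_tfs_prefix(uri: str) -> str:
    return re.sub(r"^/tfs/(.*)", r"/\1", uri)

print(strip_tfs_prefix("/tfs/v1/models/Servo:predict"))
# -> /v1/models/Servo:predict
```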
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 8)
algo-1-4gcx7_1  | INFO:__main__:nginx version info:
algo-1-4gcx7_1  | nginx version: nginx/1.16.1
algo-1-4gcx7_1  | built by gcc 5.4.0 20160609 (Ubuntu 5.4.0-6ubuntu1~16.04.11) 
algo-1-4gcx7_1  | built with OpenSSL 1.0.2g  1 Mar 2016
algo-1-4gcx7_1  | TLS SNI support enabled
algo-1-4gcx7_1  | configure arguments: --prefix=/etc/nginx --sbin-path=/usr/sbin/nginx --modules-path=/usr/lib/nginx/modules --conf-path=/etc/nginx/nginx.conf --error-log-path=/var/log/nginx/error.log --http-log-path=/var/log/nginx/access.log --pid-path=/var/run/nginx.pid --lock-path=/var/run/nginx.lock --http-client-body-temp-path=/var/cache/nginx/client_temp --http-proxy-temp-path=/var/cache/nginx/proxy_temp --http-fastcgi-temp-path=/var/cache/nginx/fastcgi_temp --http-uwsgi-temp-path=/var/cache/nginx/uwsgi_temp --http-scgi-temp-path=/var/cache/nginx/scgi_temp --user=nginx --group=nginx --with-compat --with-file-aio --with-threads --with-http_addition_module --with-http_auth_request_module --with-http_dav_module --with-http_flv_module --with-http_gunzip_module --with-http_gzip_static_module --with-http_mp4_module --with-http_random_index_module --with-http_realip_module --with-http_secure_link_module --with-http_slice_module --with-http_ssl_module --with-http_stub_status_module --with-http_sub_module --with-http_v2_module --with-mail --with-mail_ssl_module --with-stream --with-stream_realip_module --with-stream_ssl_module --with-stream_ssl_preread_module --with-cc-opt='-g -O2 -fPIE -fstack-protector-strong -Wformat -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -fPIC' --with-ld-opt='-Wl,-Bsymbolic-functions -fPIE -pie -Wl,-z,relro -Wl,-z,now -Wl,--as-needed -pie'
algo-1-4gcx7_1  | INFO:__main__:started nginx (pid: 10)
algo-1-4gcx7_1  | 2020-06-22 03:34:03.556226: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-4gcx7_1  | 2020-06-22 03:34:03.556292: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-4gcx7_1  | 2020-06-22 03:34:03.656756: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:03.656813: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:03.656833: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:03.657035: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:03.657059: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:03.669086: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-4gcx7_1  | 2020-06-22 03:34:03.736786: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-4gcx7_1  | 2020-06-22 03:34:04.211515: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-4gcx7_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-4gcx7_1  | Elastic Inference Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-4gcx7_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | [Mon Jun 22 03:34:06 2020, 277915us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-4gcx7_1  | [Mon Jun 22 03:34:06 2020, 278047us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-4gcx7_1  |     EI Error Code: [3, 16, 8]
algo-1-4gcx7_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-4gcx7_1  |     EI Request ID: TF-FD5416E4-9C1D-4B39-A9EF-DBDF1C22DBB3  --  EI Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 03:34:06.282645: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 38)
algo-1-4gcx7_1  | 2020-06-22 03:34:06.364948: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-4gcx7_1  | 2020-06-22 03:34:06.365014: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-4gcx7_1  | 2020-06-22 03:34:06.465505: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:06.465560: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:06.465688: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:06.465729: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:06.465744: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:06.478010: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-4gcx7_1  | 2020-06-22 03:34:06.543949: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-4gcx7_1  | 2020-06-22 03:34:07.006893: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-4gcx7_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-4gcx7_1  | Elastic Inference Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-4gcx7_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-4gcx7_1  | 
CPU times: user 1.51 s, sys: 330 ms, total: 1.84 s
Wall time: 36.8 s
algo-1-4gcx7_1  | 172.18.0.1 - - [22/Jun/2020:03:34:07 +0000] "GET /ping HTTP/1.1" 200 3 "-" "-"
algo-1-4gcx7_1  | [Mon Jun 22 03:34:09 2020, 072736us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-4gcx7_1  | [Mon Jun 22 03:34:09 2020, 072805us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-4gcx7_1  |     EI Error Code: [3, 16, 8]
algo-1-4gcx7_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-4gcx7_1  |     EI Request ID: TF-2590AFDB-30A0-465D-BD7E-724CF339A789  --  EI Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 03:34:09.077583: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | ... (the same cycle repeats several more times with identical output — tensorflow serving restarts, loads the SavedModel, then fails with EI Error Code [3, 16, 8] "Unable to authenticate with accelerator" and exits with status 6; only the timestamps, PIDs, and EI request IDs change)
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 246)
algo-1-4gcx7_1  | 2020-06-22 03:34:28.607112: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-4gcx7_1  | 2020-06-22 03:34:28.607185: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-4gcx7_1  | 2020-06-22 03:34:28.707611: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:28.707666: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:28.707702: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:28.707733: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:28.707750: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:28.719507: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-4gcx7_1  | 2020-06-22 03:34:28.782931: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-4gcx7_1  | 2020-06-22 03:34:29.234765: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-4gcx7_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-4gcx7_1  | Elastic Inference Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-4gcx7_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | [Mon Jun 22 03:34:31 2020, 289776us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-4gcx7_1  | [Mon Jun 22 03:34:31 2020, 289844us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-4gcx7_1  |     EI Error Code: [3, 16, 8]
algo-1-4gcx7_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-4gcx7_1  |     EI Request ID: TF-F286131D-9203-4DC6-8813-478CBC1946A0  --  EI Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 03:34:31.294411: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 272)
algo-1-4gcx7_1  | 2020-06-22 03:34:31.378925: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-4gcx7_1  | 2020-06-22 03:34:31.378994: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-4gcx7_1  | 2020-06-22 03:34:31.479471: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:31.479526: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:31.479549: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:31.479583: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:31.479600: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:31.491719: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-4gcx7_1  | 2020-06-22 03:34:31.556196: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-4gcx7_1  | 2020-06-22 03:34:32.018491: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-4gcx7_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-4gcx7_1  | Elastic Inference Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-4gcx7_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | [Mon Jun 22 03:34:34 2020, 076079us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-4gcx7_1  | [Mon Jun 22 03:34:34 2020, 076147us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-4gcx7_1  |     EI Error Code: [3, 16, 8]
algo-1-4gcx7_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-4gcx7_1  |     EI Request ID: TF-B6222F2B-719F-404F-9AEF-D6952095FBCF  --  EI Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 03:34:34.080668: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 298)
algo-1-4gcx7_1  | 2020-06-22 03:34:34.163576: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-4gcx7_1  | 2020-06-22 03:34:34.163642: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-4gcx7_1  | 2020-06-22 03:34:34.264058: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:34.264112: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:34.264135: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:34.264191: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:34.264210: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:34.276670: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-4gcx7_1  | 2020-06-22 03:34:34.340455: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-4gcx7_1  | 2020-06-22 03:34:34.792800: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-4gcx7_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-4gcx7_1  | Elastic Inference Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-4gcx7_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | [Mon Jun 22 03:34:36 2020, 855123us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-4gcx7_1  | [Mon Jun 22 03:34:36 2020, 855186us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-4gcx7_1  |     EI Error Code: [3, 16, 8]
algo-1-4gcx7_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-4gcx7_1  |     EI Request ID: TF-63BDADD1-CE09-4528-94C7-3A79D10BEDB1  --  EI Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 03:34:36.859702: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 324)
algo-1-4gcx7_1  | 2020-06-22 03:34:36.941256: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-4gcx7_1  | 2020-06-22 03:34:36.941326: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-4gcx7_1  | 2020-06-22 03:34:37.041806: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:37.041861: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:37.041879: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:37.041909: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:37.041927: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:37.053708: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-4gcx7_1  | 2020-06-22 03:34:37.121272: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-4gcx7_1  | 2020-06-22 03:34:37.571138: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-4gcx7_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-4gcx7_1  | Elastic Inference Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-4gcx7_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | [Mon Jun 22 03:34:39 2020, 632154us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-4gcx7_1  | [Mon Jun 22 03:34:39 2020, 632222us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-4gcx7_1  |     EI Error Code: [3, 16, 8]
algo-1-4gcx7_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-4gcx7_1  |     EI Request ID: TF-F01ECC9B-531C-4239-AC3A-1F7545178872  --  EI Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 03:34:39.637230: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 350)
algo-1-4gcx7_1  | 2020-06-22 03:34:39.718670: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-4gcx7_1  | 2020-06-22 03:34:39.718735: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-4gcx7_1  | 2020-06-22 03:34:39.819180: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:39.819236: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:39.819259: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:39.819292: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:39.819310: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:39.831293: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-4gcx7_1  | 2020-06-22 03:34:39.896682: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-4gcx7_1  | 2020-06-22 03:34:40.350546: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-4gcx7_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-4gcx7_1  | Elastic Inference Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-4gcx7_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | [Mon Jun 22 03:34:42 2020, 413993us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-4gcx7_1  | [Mon Jun 22 03:34:42 2020, 414059us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-4gcx7_1  |     EI Error Code: [3, 16, 8]
algo-1-4gcx7_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-4gcx7_1  |     EI Request ID: TF-1DDAF72C-2392-4176-A2C0-9EAC5861EDAA  --  EI Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 03:34:42.418665: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 376)
algo-1-4gcx7_1  | 2020-06-22 03:34:42.500814: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-4gcx7_1  | 2020-06-22 03:34:42.500883: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-4gcx7_1  | 2020-06-22 03:34:42.601333: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:42.601387: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:42.601405: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:42.601439: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:42.601456: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:42.613361: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-4gcx7_1  | 2020-06-22 03:34:42.677765: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-4gcx7_1  | 2020-06-22 03:34:43.143410: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-4gcx7_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-4gcx7_1  | Elastic Inference Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-4gcx7_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | [Mon Jun 22 03:34:45 2020, 201714us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-4gcx7_1  | [Mon Jun 22 03:34:45 2020, 201783us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-4gcx7_1  |     EI Error Code: [3, 16, 8]
algo-1-4gcx7_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-4gcx7_1  |     EI Request ID: TF-6128140C-1D17-43BA-AAE8-733F8B33DAD0  --  EI Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 03:34:45.206360: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 402)
algo-1-4gcx7_1  | 2020-06-22 03:34:45.287213: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-4gcx7_1  | 2020-06-22 03:34:45.287280: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-4gcx7_1  | 2020-06-22 03:34:45.387698: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:45.387751: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:45.387773: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:45.387804: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:45.387818: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:45.399687: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-4gcx7_1  | 2020-06-22 03:34:45.465684: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-4gcx7_1  | 2020-06-22 03:34:45.924224: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-4gcx7_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-4gcx7_1  | Elastic Inference Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-4gcx7_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | [Mon Jun 22 03:34:47 2020, 983789us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-4gcx7_1  | [Mon Jun 22 03:34:47 2020, 983871us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-4gcx7_1  |     EI Error Code: [3, 16, 8]
algo-1-4gcx7_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-4gcx7_1  |     EI Request ID: TF-3CAFBFB5-E5E3-44E7-9EF2-298CD03978DA  --  EI Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 03:34:47.988970: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 428)
algo-1-4gcx7_1  | 2020-06-22 03:34:48.070970: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-4gcx7_1  | 2020-06-22 03:34:48.071038: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-4gcx7_1  | 2020-06-22 03:34:48.171476: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:48.171531: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:48.171555: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:48.171589: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:48.171606: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:48.183456: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-4gcx7_1  | 2020-06-22 03:34:48.248450: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-4gcx7_1  | 2020-06-22 03:34:48.713646: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-4gcx7_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-4gcx7_1  | Elastic Inference Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-4gcx7_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | [Mon Jun 22 03:34:50 2020, 770286us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-4gcx7_1  | [Mon Jun 22 03:34:50 2020, 770359us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-4gcx7_1  |     EI Error Code: [3, 16, 8]
algo-1-4gcx7_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-4gcx7_1  |     EI Request ID: TF-4EEB2460-4E4A-4327-A363-04E162FB0EF7  --  EI Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 03:34:50.775147: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 454)
algo-1-4gcx7_1  | 2020-06-22 03:34:50.857653: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-4gcx7_1  | 2020-06-22 03:34:50.857722: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-4gcx7_1  | 2020-06-22 03:34:50.958137: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:50.958189: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:50.958208: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 03:34:50.958241: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:50.958258: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 03:34:50.969968: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-4gcx7_1  | 2020-06-22 03:34:51.035208: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-4gcx7_1  | 2020-06-22 03:34:51.480539: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-4gcx7_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-4gcx7_1  | Elastic Inference Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-4gcx7_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | [Mon Jun 22 03:34:53 2020, 544826us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-4gcx7_1  | [Mon Jun 22 03:34:53 2020, 544903us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-4gcx7_1  |     EI Error Code: [3, 16, 8]
algo-1-4gcx7_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-4gcx7_1  |     EI Request ID: TF-7BC89CFB-3409-48D2-B20B-0CC7163C1AC1  --  EI Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 03:34:53.550289: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 480)
algo-1-4gcx7_1  | [... identical cycle repeats: the SavedModel loads, the EI client reports "Unable to authenticate with accelerator" (EI Error Code: [3, 16, 8]), tensorflow serving exits with status 6 and is restarted roughly every 3 seconds with a new PID and EI Request ID ...]
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 03:35:21.446600: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | [... the same cycle repeats indefinitely: TensorFlow Serving restarts (pids 740, 766, 792, 818, 844, 870, 896, 922, 948, ...), reloads the SavedModel from /opt/ml/model/export/Servo/1527887769, and dies with the identical "EI Error Code: [3, 16, 8] / Unable to authenticate with accelerator" failure roughly every 3 seconds ...]

`
It's a never-ending log: the container keeps restarting TensorFlow Serving and failing on the same EI authentication error.
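For anyone skimming a log like this: the repeated `unexpected tensorflow serving exit (status: 6). restarting.` lines mean the container is in a crash loop, where each EI authentication failure aborts `tensorflow_model_server` and the supervisor immediately restarts it. A throwaway helper like the following (entirely hypothetical, not part of the SageMaker SDK) can confirm it is one failure repeating rather than several distinct errors:

```python
import re

def summarize_crash_loop(log_text: str) -> dict:
    """Tally TF Serving restart warnings and distinct EI error codes
    in a SageMaker local-mode container log (hypothetical helper)."""
    restarts = len(re.findall(r"unexpected tensorflow serving exit", log_text))
    ei_codes = set(re.findall(r"EI Error Code: \[([\d, ]+)\]", log_text))
    return {"restarts": restarts, "distinct_ei_error_codes": sorted(ei_codes)}

sample = (
    "WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.\n"
    "    EI Error Code: [3, 16, 8]\n"
    "WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.\n"
    "    EI Error Code: [3, 16, 8]\n"
)
print(summarize_crash_loop(sample))
# {'restarts': 2, 'distinct_ei_error_codes': ['3, 16, 8']}
```

Seeing a single distinct error code across all restarts, as here, suggests a persistent configuration problem (such as the notebook role lacking permission to reach the accelerator) rather than a transient failure.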

![image](https://user-images.githubusercontent.com/3924040/85247795-8302c880-b481-11ea-8604-8c67d0f20f5f.png)

`
algo-1-4gcx7_1  | 2020/06/22 04:11:10 [error] 11#11: *2 connect() failed (111: Connection refused) while connecting to upstream, client: 172.18.0.1, server: , request: "POST /invocations HTTP/1.1", subrequest: "/v1/models/Servo:predict", upstream: "http://127.0.0.1:8501/v1/models/Servo:predict", host: "localhost:8080"
algo-1-4gcx7_1  | 2020/06/22 04:11:10 [error] 11#11: *2 connect() failed (111: Connection refused) while connecting to upstream, client: 172.18.0.1, server: , request: "POST /invocations HTTP/1.1", subrequest: "/v1/models/Servo:predict", upstream: "http://127.0.0.1:8501/v1/models/Servo:predict", host: "localhost:8080"
algo-1-4gcx7_1  | 172.18.0.1 - - [22/Jun/2020:04:11:10 +0000] "POST /invocations HTTP/1.1" 502 157 "-" "-"
---------------------------------------------------------------------------
JSONDecodeError                           Traceback (most recent call last)
<timed exec> in <module>()

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/site-packages/sagemaker/tensorflow/serving.py in predict(self, data, initial_args)
    116                 args["CustomAttributes"] = self._model_attributes
    117 
--> 118         return super(Predictor, self).predict(data, args)
    119 
    120 

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/site-packages/sagemaker/predictor.py in predict(self, data, initial_args, target_model)
    109         request_args = self._create_request_args(data, initial_args, target_model)
    110         response = self.sagemaker_session.sagemaker_runtime_client.invoke_endpoint(**request_args)
--> 111         return self._handle_response(response)
    112 
    113     def _handle_response(self, response):

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/site-packages/sagemaker/predictor.py in _handle_response(self, response)
    119         if self.deserializer is not None:
    120             # It's the deserializer's responsibility to close the stream
--> 121             return self.deserializer(response_body, response["ContentType"])
    122         data = response_body.read()
    123         response_body.close()

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/site-packages/sagemaker/predictor.py in __call__(self, stream, content_type)
    578         """
    579         try:
--> 580             return json.load(codecs.getreader("utf-8")(stream))
    581         finally:
    582             stream.close()

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/json/__init__.py in load(fp, cls, object_hook, parse_float, parse_int, parse_constant, object_pairs_hook, **kw)
    297         cls=cls, object_hook=object_hook,
    298         parse_float=parse_float, parse_int=parse_int,
--> 299         parse_constant=parse_constant, object_pairs_hook=object_pairs_hook, **kw)
    300 
    301 

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/json/__init__.py in loads(s, encoding, cls, object_hook, parse_float, parse_int, parse_constant, object_pairs_hook, **kw)
    352             parse_int is None and parse_float is None and
    353             parse_constant is None and object_pairs_hook is None and not kw):
--> 354         return _default_decoder.decode(s)
    355     if cls is None:
    356         cls = JSONDecoder

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/json/decoder.py in decode(self, s, _w)
    337 
    338         """
--> 339         obj, end = self.raw_decode(s, idx=_w(s, 0).end())
    340         end = _w(s, end).end()
    341         if end != len(s):

~/anaconda3/envs/amazonei_tensorflow_p36/lib/python3.6/json/decoder.py in raw_decode(self, s, idx)
    355             obj, end = self.scan_once(s, idx)
    356         except StopIteration as err:
--> 357             raise JSONDecodeError("Expecting value", s, err.value) from None
    358         return obj, end

JSONDecodeError: Expecting value: line 1 column 1 (char 0)

algo-1-4gcx7_1  | [Mon Jun 22 04:11:12 2020, 058078us] [Execution Engine] Error getting application context for [TensorFlow][2]
algo-1-4gcx7_1  | [Mon Jun 22 04:11:12 2020, 058140us] [Execution Engine][TensorFlow][2] Failed - Last Error: 
algo-1-4gcx7_1  |     EI Error Code: [3, 16, 8]
algo-1-4gcx7_1  |     EI Error Description: Unable to authenticate with accelerator
algo-1-4gcx7_1  |     EI Request ID: TF-4667CCC2-CFF1-4067-AFAD-4F001B17C7D3  --  EI Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 04:11:12.062818: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.
algo-1-4gcx7_1  | INFO:__main__:tensorflow version info:
algo-1-4gcx7_1  | TensorFlow ModelServer: 1.14.0-rc0+dev.sha.34d9e85
algo-1-4gcx7_1  | TensorFlow Library: 1.14.0
algo-1-4gcx7_1  | EI Version: EI-1.4
algo-1-4gcx7_1  | INFO:__main__:tensorflow serving command: tensorflow_model_server --port=9000 --rest_api_port=8501 --model_config_file=/sagemaker/model-config.cfg 
algo-1-4gcx7_1  | INFO:__main__:started tensorflow serving (pid: 20890)
algo-1-4gcx7_1  | 2020-06-22 04:11:12.147005: I tensorflow_serving/model_servers/server_core.cc:462] Adding/updating models.
algo-1-4gcx7_1  | 2020-06-22 04:11:12.147071: I tensorflow_serving/model_servers/server_core.cc:561]  (Re-)adding model: Servo
algo-1-4gcx7_1  | 2020-06-22 04:11:12.247496: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 04:11:12.247554: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 04:11:12.247573: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: Servo version: 1527887769}
algo-1-4gcx7_1  | 2020-06-22 04:11:12.247609: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:363] Attempting to load native SavedModelBundle in bundle-shim from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 04:11:12.247626: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | 2020-06-22 04:11:12.259613: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:54] Reading meta graph with tags { serve }
algo-1-4gcx7_1  | 2020-06-22 04:11:12.323682: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:202] Restoring SavedModel bundle.
algo-1-4gcx7_1  | 2020-06-22 04:11:12.773203: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:151] Running initialization op on SavedModel bundle at path: /opt/ml/model/export/Servo/1527887769
algo-1-4gcx7_1  | Using Amazon Elastic Inference Client Library Version: 1.5.3
algo-1-4gcx7_1  | Number of Elastic Inference Accelerators Available: 1
algo-1-4gcx7_1  | Elastic Inference Accelerator ID: eia-9ff21bd88cbb4f4dad5bea3d6884f322
algo-1-4gcx7_1  | Elastic Inference Accelerator Type: eia1.medium
algo-1-4gcx7_1  | Elastic Inference Accelerator Ordinal: 0
algo-1-4gcx7_1  | 
algo-1-4gcx7_1  | ... (the same cycle repeats indefinitely: EI Error Code [3, 16, 8] "Unable to authenticate with accelerator", fatal exit with status 6, then TF Serving restarts with a new PID and request ID) ...
algo-1-4gcx7_1  |     EI Client Version: 1.5.3
algo-1-4gcx7_1  | 2020-06-22 04:11:36.949251: F external/org_tensorflow/tensorflow/contrib/ei/session/eia_session.cc:1219] Non-OK-status: SwapExStateWithEI(tmp_inputs, tmp_outputs, tmp_freeze) status: Internal: Failed to get the initial operator whitelist from server.
algo-1-4gcx7_1  | WARNING:__main__:unexpected tensorflow serving exit (status: 6). restarting.

Again, it's a never-ending log. BTW, I only have access to list our own S3 buckets, so I first downloaded the model from the AWS public bucket and uploaded it to mine.
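For anyone reproducing this, mirroring the public sample model into your own bucket can be sketched roughly as below. `my-own-bucket` and the local path are placeholders, and the actual copy calls are commented out since they need AWS credentials and boto3:

```python
# Rough sketch of mirroring the public SageMaker sample model into your own
# bucket. "my-own-bucket" is a placeholder; the commented-out copy calls
# require AWS credentials and boto3.

KEY = "tensorflow/model/resnet/resnet_50_v2_fp32_NCHW.tar.gz"

def sample_model_uri(region):
    """Build the S3 URI of the public sample model for a given region."""
    return "s3://sagemaker-sample-data-{}/{}".format(region, KEY)

# import boto3
# s3 = boto3.client("s3")
# s3.download_file("sagemaker-sample-data-us-west-2", KEY, "/tmp/model.tar.gz")
# s3.upload_file("/tmp/model.tar.gz", "my-own-bucket", KEY)

print(sample_model_uri("us-west-2"))
# → s3://sagemaker-sample-data-us-west-2/tensorflow/model/resnet/resnet_50_v2_fp32_NCHW.tar.gz
```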

pankajxyz commented 4 years ago

Solved it. I suspected it was related to roles/permissions, so I asked our DevOps team. They tried a few things and it finally worked.
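For others hitting the same "Unable to authenticate with accelerator" error: per the EI setup guide, the role attached to the notebook instance needs an IAM policy allowing the `elastic-inference:Connect` action. A minimal sketch (assuming a resource-level wildcard is acceptable in your account; the setup guide also describes creating a VPC endpoint for the EI runtime service):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["elastic-inference:Connect"],
      "Resource": "*"
    }
  ]
}
```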

pankajxyz commented 4 years ago

After solving the above, I am now trying my own TensorFlow Serving model (exported from an estimator). It can generate output/predictions.

But I need to check performance because of the following message:

> EI may incur sub-optimal inference latency on this model due to below operators. Please contact amazon-ei-feedback@amazon.com with this message if inference latency does not meet your application requirements
> algo-1-0bp9q_1 | Operators: BatchMatMulV2
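To find out ahead of deployment which graph operators might trigger that warning, one could scan the op types in the SavedModel against the operators EI flagged. A minimal sketch — the `EI_SLOW_OPS` set is illustrative, taken only from the warning above, and the TF 1.x graph-loading step is left as a comment since it needs TensorFlow and a model on disk:

```python
# Sketch: report which operators in a model graph EI flagged as potentially
# slow. EI_SLOW_OPS is a hypothetical list based only on the warning in this
# thread; extend it with whatever operators your own warning message lists.
EI_SLOW_OPS = {"BatchMatMulV2"}

def flag_slow_ops(graph_op_types):
    """Return the operators that may fall back to the client instance."""
    return sorted(set(graph_op_types) & EI_SLOW_OPS)

# With TF 1.x, graph_op_types could be collected roughly like:
#   with tf.Session(graph=tf.Graph()) as sess:
#       tf.saved_model.loader.load(sess, ["serve"], export_dir)
#       graph_op_types = {n.op for n in sess.graph.as_graph_def().node}

print(flag_slow_ops({"MatMul", "BatchMatMulV2", "Relu"}))
# → ['BatchMatMulV2']
```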

Can you please help?

laurenyu commented 4 years ago

I'll pass this along to the EI team to see if they have any advice for the message you're getting. Did you also try emailing amazon-ei-feedback@amazon.com?

pankajxyz commented 4 years ago

In parallel, I've also asked the AWS Support Center; we have a Developer support plan. Surprisingly, the response there is much slower than here. Thanks for your help. Waiting for the advice.

mmiaz commented 3 years ago

> role/permission

Hi, do you mind sharing what the permission issue was? I was following the prerequisites at https://docs.aws.amazon.com/elastic-inference/latest/developerguide/setting-up-ei.html to set up the IAM policy, but I still get this error. Would you mind sharing the solution?

sklipnoty commented 3 years ago

> > role/permission
>
> hi, do you mind sharing what's the permission issue? i was following the prerequisites https://docs.aws.amazon.com/elastic-inference/latest/developerguide/setting-up-ei.html to make up the IAM policy, but still get this error. Would you mind sharing the solution?

Did you get a potential solution for this? I am facing the same issue and my config seems fine.

kevinyaoxm commented 3 years ago

Would you mind sharing the IAM policy you configured?

mmiaz commented 3 years ago

> > > role/permission
> >
> > hi, do you mind sharing what's the permission issue? i was following the prerequisites https://docs.aws.amazon.com/elastic-inference/latest/developerguide/setting-up-ei.html to make up the IAM policy, but still get this error. Would you mind sharing the solution?
>
> Did you get a potential solution for this? I am facing the same issue and config seems fine.

@sklipnoty Hi, the role configs should be the same. But at the time, I think what helped was to:

  1. choose the right region (https://aws.amazon.com/machine-learning/elastic-inference/pricing/), and
  2. make sure to pick the Deep Learning AMI (Ubuntu 16.04) rather than the Deep Learning AMI (Ubuntu 18.04). But of course that was a few months ago, so please check the updated documentation for current AMI options. Thanks