tobegit3hub / simple_tensorflow_serving

Generic and easy-to-use serving service for machine learning models
https://stfs.readthedocs.io
Apache License 2.0

Container fails with errors - docker - tobegit3hub/simple_tensorflow_serving:latest-py34 #58

Open · ashbeats opened this issue 5 years ago

ashbeats commented 5 years ago

@tobegit3hub

I'm trying to run the Docker image on Windows 10.

(base) λ docker --version
Docker version 18.09.2, build 6247962

(base) λ docker run -d -p 8500:8500 tobegit3hub/simple_tensorflow_serving:latest-py34

The Docker image pulls with no errors, and the container also starts up with no obvious errors.

However, the endpoint at http://127.0.0.1:8500/ returns an internal server error.

(base) λ curl http://127.0.0.1:8500/
Internal Server Error

On inspection, the Docker logs show:

(base) λ docker logs 0b8b9968c3b9 
[uWSGI] getting INI configuration from /tmp/uwsgi.ini
*** Starting uWSGI 2.0.18 (64bit) on [Sat Jul 13 03:56:44 2019] ***
compiled with version: 6.3.0 20170516 on 10 July 2019 06:45:50
os: Linux-4.9.125-linuxkit #1 SMP Fri Sep 7 08:20:28 UTC 2018
nodename: 0b8b9968c3b9
machine: x86_64
clock source: unix
pcre jit disabled
detected number of CPU cores: 2
current working directory: /simple_tensorflow_serving
writing pidfile to /tmp/uwsgi.pid
detected binary path: /usr/local/bin/uwsgi
uWSGI running as root, you can use --uid/--gid/--chroot options
*** WARNING: you are running uWSGI as root !!! (use the --uid flag) ***
your memory page size is 4096 bytes
detected max file descriptor number: 1048576
lock engine: pthread robust mutexes
thunder lock: disabled (you can enable it with --thunder-lock)
uWSGI http bound on 0.0.0.0:8500 fd 3
uwsgi socket 0 bound to UNIX address /tmp/uwsgi.sock fd 6
uWSGI running as root, you can use --uid/--gid/--chroot options
*** WARNING: you are running uWSGI as root !!! (use the --uid flag) ***
Python version: 3.4.10 (default, Mar 20 2019, 00:50:15)  [GCC 6.3.0 20170516]
Python main interpreter initialized at 0x5573028aa050
uWSGI running as root, you can use --uid/--gid/--chroot options
*** WARNING: you are running uWSGI as root !!! (use the --uid flag) ***
python threads support enabled
your server socket listen backlog is limited to 100 connections
your mercy for graceful operations on workers is 60 seconds
mapped 145840 bytes (142 KB) for 1 cores
*** Operational MODE: single process ***
Traceback (most recent call last):
  File "./simple_tensorflow_serving/server.py", line 19, in <module>
    from manager import InferenceServiceManager
  File "./simple_tensorflow_serving/./manager.py", line 16, in <module>
    from tensorflow_inference_service import TensorFlowInferenceService
  File "./simple_tensorflow_serving/./tensorflow_inference_service.py", line 12, in <module>
    import tensorflow as tf
  File "/usr/local/lib/python3.4/site-packages/tensorflow/__init__.py", line 35, in <module>
    from tensorflow._api.v1 import compat
  File "/usr/local/lib/python3.4/site-packages/tensorflow/_api/v1/compat/__init__.py", line 21, in <module>
    from tensorflow._api.v1.compat import v1
  File "/usr/local/lib/python3.4/site-packages/tensorflow/_api/v1/compat/v1/__init__.py", line 649, in <module>
    from tensorflow_estimator.python.estimator.api._v1 import estimator
  File "/usr/local/lib/python3.4/site-packages/tensorflow_estimator/__init__.py", line 8, in <module>
    from tensorflow_estimator._api.v1 import estimator
  File "/usr/local/lib/python3.4/site-packages/tensorflow_estimator/_api/v1/estimator/__init__.py", line 8, in <module>
    from tensorflow_estimator._api.v1.estimator import experimental
  File "/usr/local/lib/python3.4/site-packages/tensorflow_estimator/_api/v1/estimator/experimental/__init__.py", line 8, in <module>
    from tensorflow_estimator.python.estimator.canned.dnn import dnn_logit_fn_builder
  File "/usr/local/lib/python3.4/site-packages/tensorflow_estimator/python/estimator/__init__.py", line 25, in <module>
    import tensorflow_estimator.python.estimator.estimator_lib
  File "/usr/local/lib/python3.4/site-packages/tensorflow_estimator/python/estimator/estimator_lib.py", line 53, in <module>
    from tensorflow_estimator.python.estimator.inputs import inputs
  File "/usr/local/lib/python3.4/site-packages/tensorflow_estimator/python/estimator/inputs/inputs.py", line 23, in <module>
    from tensorflow_estimator.python.estimator.inputs.pandas_io import pandas_input_fn
  File "/usr/local/lib/python3.4/site-packages/tensorflow_estimator/python/estimator/inputs/pandas_io.py", line 31, in <module>
    import pandas as pd
  File "/usr/local/lib/python3.4/site-packages/pandas-0.25.0rc0-py3.4-linux-x86_64.egg/pandas/__init__.py", line 30, in <module>
    from pandas._libs import hashtable as _hashtable, lib as _lib, tslib as _tslib
  File "/usr/local/lib/python3.4/site-packages/pandas-0.25.0rc0-py3.4-linux-x86_64.egg/pandas/_libs/__init__.py", line 3, in <module>
    from .tslibs import (
  File "/usr/local/lib/python3.4/site-packages/pandas-0.25.0rc0-py3.4-linux-x86_64.egg/pandas/_libs/tslibs/__init__.py", line 3, in <module>
    from .conversion import localize_pydatetime, normalize_date
  File "pandas/_libs/tslibs/conversion.pyx", line 234, in init pandas._libs.tslibs.conversion
AttributeError: type object 'pandas._libs.tslibs.conversion._TSObject' has no attribute '__reduce_cython__'
unable to load app 0 (mountpoint='') (callable not found or import error)
*** no app loaded. going in full dynamic mode ***
uWSGI running as root, you can use --uid/--gid/--chroot options
*** WARNING: you are running uWSGI as root !!! (use the --uid flag) ***
*** uWSGI is running in multiple interpreter mode ***
spawned uWSGI master process (pid: 9)
spawned uWSGI worker 1 (pid: 13, cores: 1)
spawned uWSGI http 1 (pid: 14)

Any ideas?

tobegit3hub commented 5 years ago

It seems to be a compatibility problem between the latest code, uWSGI, and pandas.

The py34 code path has not been tuned yet, so it may have some bugs. You could try installing from source, or pip install an older version such as simple-tensorflow-serving==0.6.6.
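
A rough sketch of that fallback (the commands are adapted from the project README; treat the model path as a placeholder for your own saved model):

# Option 1: install an older release from PyPI
pip install simple-tensorflow-serving==0.6.6

# Option 2: install from source
git clone https://github.com/tobegit3hub/simple_tensorflow_serving
cd ./simple_tensorflow_serving/
python ./setup.py install

# Then start the server and point it at a SavedModel directory
simple_tensorflow_serving --model_base_path="./models/tensorflow_template_application_model"

After that, the same curl against http://127.0.0.1:8500/ should return the dashboard instead of the internal server error.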