dsindex / syntaxnet

reference code for syntaxnet

Serving different language model #2 - Export #19

Closed bazhura closed 7 years ago

bazhura commented 7 years ago

Hello @dsindex, thank you for putting this project together; it is wonderful to have it.

I am struggling to put things together to serve the Croatian model from Parsey's Cousins on TensorFlow Serving (let's call it the "Different-Lang" model). I have reached the point where the trained model needs to be exported, but I cannot get my head around it. Both models_tf inside serving and the syntaxnet used for training were checked out at a4b7bb9a5dd2c021edcd3d68d326255c734d0ef0.

  1. First things first, I built TF serving as described.

    git checkout 89e9dfbea055027bc31878ee8da66b54a701a746
    git submodule update --init --recursive
    cd models_tf 
    git checkout a4b7bb9a5dd2c021edcd3d68d326255c734d0ef0
    ...
    cd serving
    bazel --output_user_root=bazel_root build --nocheck_visibility -c opt -s //tensorflow_serving/example:parsey_api --genrule_strategy=standalone --spawn_strategy=standalone --verbose_failures
  2. Next, I built SyntaxNet for CPU on a different machine, as a separate project

    cd syntaxnet/tensorflow
    ./configure
    cd ..
    bazel test syntaxnet/... util/utf8/...
  3. Test Parser Trainer

    ./parser_trainer_test.sh
    # INFO:tensorflow:Seconds elapsed in evaluation: 0.30, eval metric: 89.83%
    # + echo PASS
    # PASS
  4. Download UD and train Different-Lang model

    
    (downloading ud-treebanks-v2.0.tgz)
    # copy the UD_Different-Lang folder into the "work" folder, including its two files: ...-ud-dev.conllu / ...-ud-train.conllu. There is no test set in UD corpus v.2.0, so I substitute the dev set for it.

Then I added the context.pbtxt and updated the file location values, the "...-ud-..." file names, and the record-format to "different-lang-text" (see the sketch just after this step).

./train.sh -v -v
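To make the context.pbtxt change concrete, here is a sketch of what one edited corpus entry could look like; the entry name, the file name, and the "different-lang-text" record format below are placeholders based on the description above, and the real template in the repo may differ:

    # hypothetical entry: adjust name, record_format and file_pattern to your corpus
    input {
      name: "training-corpus"
      record_format: "different-lang-text"
      Part {
        file_pattern: "/home/alina/work/UD_Different-Lang/xx-ud-train.conllu"
      }
    }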


So now I ended up with one project containing TF Serving (produced in step 1) and a separate project where I trained the model. The latter project has the following directory structure:
├── autoencoder
├── inception
├── namignizer
├── neural_gpu
├── swivel
├── syntaxnet
│   ├── syntaxnet
│   │   ├── models
│   │   │   └── parsey_mcparseface
│   │   ├── ops
│   │   └── testdata
│   ├── tensorflow
│   │   ├── tools
│   │   └── util
│   │       └── python
│   │           ├── python_include -> /usr/include/python2.7
│   │           └── python_lib -> /usr/lib/python2.7/dist-packages
│   ├── third_party
│   │   └── utf
│   ├── tools
│   ├── util
│   │   └── utf8
│   └── work
│       ├── api
│       │   ├── parsey_client
│       │   │   └── api
│       │   │       ├── cali
│       │   │       │   └── nlp
│       │   │       └── syntaxnet
│       │   └── parsey_model
│       │       └── assets
│       ├── corpus
│       │   └── ud-treebanks-v2.0
│       ├── English
│       ├── models
│       ├── models_sejong
│       ├── sejong
│       ├── testdata
│       │   └── tmp
│       │       └── syntaxnet-output
│       │           └── brain_parser
│       │               ├── greedy
│       │               │   └── 128-0.08-3600-0.9-0
│       │               └── structured
│       │                   └── 128-0.001-3600-0.9-0
│       ├── UD_Different-Lang
│       │   └── tmp
│       │       └── syntaxnet-output
│       │           ├── brain_parser
│       │           │   ├── greedy
│       │           │   │   └── 512x512-0.08-4400-0.85-4
│       │           │   └── structured
│       │           │       └── 512x512-0.02-100-0.9-0
│       │           └── brain_pos
│       │               └── greedy
│       │                   └── 64-0.08-3600-0.9-0
│       └── UD_English
└── transformer
    └── data

I also completed these steps:

$ cp ../api/parsey_mcparseface.py tensorflow_serving/example
$ bazel --output_user_root=bazel_root build --nocheck_visibility -c opt -s //tensorflow_serving/example:parsey_mcparseface --genrule_strategy=standalone --spawn_strategy=standalone --verbose_failures
$ ls bazel-bin/tensorflow_serving/example/parsey_mcparseface

Then I simply copied the UD_Different-Lang folder with all the trained results into the TF Serving work folder, side by side with UD_English, and set the path:

$ cat ../models/context.pbtxt.template | sed "s=OUTPATH=/home/alina/work/UD_Different-Lang/tmp/syntaxnet-output/brain_pos/greedy/64-0.08-3600-0.9-0=" > ../models/context.pbtxt
$ bazel-bin/tensorflow_serving/example/parsey_mcparseface --model_dir=../models --export_path=exported
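As a sanity check that the substitution actually took (assuming OUTPATH was the only placeholder in the template), something like this can be run before exporting:

    $ grep -c OUTPATH ../models/context.pbtxt    # should print 0
    $ grep file_pattern ../models/context.pbtxt | head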


However, this produced an error. Where did I go wrong? Thank you!
bazel-bin/tensorflow_serving/example/parsey_mcparseface --model_dir=../models --export_path=exported
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 41 terms from /home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_pos/greedy/64-0.08-3600-0.9-0/label-map.
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:35] Features: 
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:36] Embedding names: 
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:37] Embedding dims: 
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 41 terms from /home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_pos/greedy/64-0.08-3600-0.9-0/label-map.
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:35] Features: input.word input(1).word input(2).word input(3).word stack.word stack(1).word stack(2).word stack(3).word stack.child(1).word stack.child(1).sibling(-1).word stack.child(-1).word stack.child(-1).sibling(1).word stack(1).child(1).word stack(1).child(1).sibling(-1).word stack(1).child(-1).word stack(1).child(-1).sibling(1).word stack.child(2).word stack.child(-2).word stack(1).child(2).word stack(1).child(-2).word;input.tag input(1).tag input(2).tag input(3).tag stack.tag stack(1).tag stack(2).tag stack(3).tag stack.child(1).tag stack.child(1).sibling(-1).tag stack.child(-1).tag stack.child(-1).sibling(1).tag stack(1).child(1).tag stack(1).child(1).sibling(-1).tag stack(1).child(-1).tag stack(1).child(-1).sibling(1).tag stack.child(2).tag stack.child(-2).tag stack(1).child(2).tag stack(1).child(-2).tag;stack.child(1).label stack.child(1).sibling(-1).label stack.child(-1).label stack.child(-1).sibling(1).label stack(1).child(1).label stack(1).child(1).sibling(-1).label stack(1).child(-1).label stack(1).child(-1).sibling(1).label stack.child(2).label stack.child(-2).label stack(1).child(2).label stack(1).child(-2).label
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:36] Embedding names: words;tags;labels
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:37] Embedding dims: 64;32;32
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 34340 terms from /home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_pos/greedy/64-0.08-3600-0.9-0/word-map.
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 17 terms from /home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_pos/greedy/64-0.08-3600-0.9-0/tag-map.
Traceback (most recent call last):
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/tensorflow_serving/example/parsey_mcparseface.py", line 188, in <module>
    tf.app.run()
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/external/org_tensorflow/tensorflow/python/platform/app.py", line 30, in run
    sys.exit(main(sys.argv))
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/tensorflow_serving/example/parsey_mcparseface.py", line 172, in main
    model[prefix]["documents"] = Build(sess, source, model[prefix])
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/tensorflow_serving/example/parsey_mcparseface.py", line 75, in Build
    document_source=document_source)
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/external/syntaxnet/syntaxnet/structured_graph_builder.py", line 242, in AddEvaluation
    document_source=document_source))
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/external/syntaxnet/syntaxnet/structured_graph_builder.py", line 100, in _AddBeamReader
    documents_from_input=documents_from_input)
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/external/syntaxnet/syntaxnet/ops/gen_parser_ops.py", line 100, in beam_parse_reader
    name=name)
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/external/org_tensorflow/tensorflow/python/framework/op_def_library.py", line 627, in apply_op
    (key, op_type_name, attr_value.i, attr_def.minimum))
ValueError: Attr 'feature_size' of 'BeamParseReader' Op passed 0 less than minimum 1.
dsindex commented 7 years ago

@bazhura

What about not applying the patch? I mean:

$ git clone --recurse-submodules https://github.com/tensorflow/serving
# checkout proper version of serving
$ cd serving
$ git checkout 89e9dfbea055027bc31878ee8da66b54a701a746
$ git submodule update --init --recursive
# checkout proper version of tf_models
$ cd tf_models
$ git checkout a4b7bb9a5dd2c021edcd3d68d326255c734d0ef0

$ cd syntaxnet
$ cd tensorflow
$ ./configure
$ cd ..

# modify ./tensorflow/tensorflow/workspace.bzl for downloading 'zlib'
# native.new_http_archive(
#    name = "zlib_archive",
#    url = "http://zlib.net/fossils/zlib-1.2.8.tar.gz",
#    sha256 = "36658cb768a54c1d4dec43c3116c27ed893e88b02ecfcb44f2166f9c0b7f2a0d",
#    build_file = path_prefix + "zlib.BUILD",
#  )

$ bazel test --linkopt=-headerpad_max_install_names syntaxnet/... util/utf8/...

and then

$ git clone https://github.com/dsindex/syntaxnet.git work
$ cd 
....
$ ./train.sh
....

Finally, export the model and use it (say, from /path/to/models):

# go to the patched version of serving
$ cat /path/to/models/context.pbtxt.template | sed "s=OUTPATH=/path/to/models=" > /path/to/models/context.pbtxt
$ bazel-bin/tensorflow_serving/example/parsey_mcparseface --model_dir=/path/to/models --export_path=exported

# modify all paths in exported/00000001/assets/context.pbtxt
# for example, 
# from
# input {
#  name: "tag-map"
#  Part {
#    file_pattern: "syntaxnet/models/parsey_mcparseface/tag-map"
#  }
# }
# to
# input {
#  name: "tag-map"
#  Part {
#    file_pattern: "tag-map"
#  }
# }
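# (one possible way to do that rewrite in bulk - a sketch that assumes GNU sed and
#  that every file_pattern points into syntaxnet/models/parsey_mcparseface/ as above)
$ sed -i 's=file_pattern: "syntaxnet/models/parsey_mcparseface/=file_pattern: "=' exported/00000001/assets/context.pbtxt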

# run parsey_api with exported model
$ ./bazel-bin/tensorflow_serving/example/parsey_api --port=9000 exported/00000001
bazhura commented 7 years ago

@dsindex that's an idea; I will try it without the patch and come back with the result. Thank you!

dsindex commented 7 years ago

@bazhura

Unfortunately, I got the same error.

$ cp -rf /other_path/models trained_models
$ cat trained_models/context.pbtxt.template | sed "s=OUTPATH=/Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/trained_models=" > trained_models/context.pbtxt
$ bazel-bin/tensorflow_serving/example/parsey_mcparseface --model_dir=trained_models --export_path=exported
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 49 terms from /Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/trained_models/label-map.
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:35] Features:
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:36] Embedding names:
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:37] Embedding dims:
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 49 terms from /Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/trained_models/label-map.
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:35] Features: input.word input(1).word input(2).word input(3).word stack.word stack(1).word stack(2).word stack(3).word stack.child(1).word stack.child(1).sibling(-1).word stack.child(-1).word stack.child(-1).sibling(1).word stack(1).child(1).word stack(1).child(1).sibling(-1).word stack(1).child(-1).word stack(1).child(-1).sibling(1).word stack.child(2).word stack.child(-2).word stack(1).child(2).word stack(1).child(-2).word;input.tag input(1).tag input(2).tag input(3).tag stack.tag stack(1).tag stack(2).tag stack(3).tag stack.child(1).tag stack.child(1).sibling(-1).tag stack.child(-1).tag stack.child(-1).sibling(1).tag stack(1).child(1).tag stack(1).child(1).sibling(-1).tag stack(1).child(-1).tag stack(1).child(-1).sibling(1).tag stack.child(2).tag stack.child(-2).tag stack(1).child(2).tag stack(1).child(-2).tag;stack.child(1).label stack.child(1).sibling(-1).label stack.child(-1).label stack.child(-1).sibling(1).label stack(1).child(1).label stack(1).child(1).sibling(-1).label stack(1).child(-1).label stack(1).child(-1).sibling(1).label stack.child(2).label stack.child(-2).label stack(1).child(2).label stack(1).child(-2).label
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:36] Embedding names: words;tags;labels
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:37] Embedding dims: 64;32;32
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 18752 terms from /Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/trained_models/word-map.
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 50 terms from /Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/trained_models/tag-map.
Traceback (most recent call last):
  File "/Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/tensorflow_serving/example/parsey_mcparseface.py", line 188, in <module>
    tf.app.run()
  File "/Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/external/org_tensorflow/tensorflow/python/platform/app.py", line 30, in run
    sys.exit(main(sys.argv))
  File "/Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/tensorflow_serving/example/parsey_mcparseface.py", line 172, in main
    model[prefix]["documents"] = Build(sess, source, model[prefix])
  File "/Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/tensorflow_serving/example/parsey_mcparseface.py", line 75, in Build
    document_source=document_source)
  File "/Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/external/syntaxnet/syntaxnet/structured_graph_builder.py", line 242, in AddEvaluation
    document_source=document_source))
  File "/Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/external/syntaxnet/syntaxnet/structured_graph_builder.py", line 100, in _AddBeamReader
    documents_from_input=documents_from_input)
  File "/Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/external/syntaxnet/syntaxnet/ops/gen_parser_ops.py", line 100, in beam_parse_reader
    name=name)
  File "/Users/donghwon/Desktop/develop/models/syntaxnet/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/external/org_tensorflow/tensorflow/python/framework/op_def_library.py", line 627, in apply_op
    (key, op_type_name, attr_value.i, attr_def.minimum))
ValueError: Attr 'feature_size' of 'BeamParseReader' Op passed 0 less than minimum 1.
dsindex commented 7 years ago

I found a difference between Parsey and a trained model: Parsey uses the 'brain_tagger' prefix, but 'train.sh' uses the 'brain_pos' prefix, so the export module cannot match them. I am going to fix it.
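Until that fix lands, a rough manual workaround (a sketch only, using the directory layout from the tree above) is to rename the tagger output directory and regenerate context.pbtxt against the new path:

    # hypothetical manual workaround until train.sh is fixed
    $ cd /home/alina/work/UD_Different-Lang/tmp/syntaxnet-output
    $ mv brain_pos brain_tagger
    # then redo the OUTPATH substitution so context.pbtxt points at .../brain_tagger/greedy/...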

dsindex commented 7 years ago

@bazhura

After modifying 'brain_pos' to 'brain_tagger' in all related scripts and directories, it works well.

$ bazel-bin/tensorflow_serving/example/parsey_mcparseface --model_dir=trained_models --export_path=exported
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 49 terms from /Users/donghwon/Desktop/develop/serving/tf_models/syntaxnet/work/models/label-map.
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:35] Features: stack(3).word stack(2).word stack(1).word stack.word input.word input(1).word input(2).word input(3).word; input.digit input.hyphen; stack.suffix(length=2) input.suffix(length=2) input(1).suffix(length=2); stack.prefix(length=2) input.prefix(length=2) input(1).prefix(length=2)
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:36] Embedding names: words;other;suffix;prefix
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:37] Embedding dims: 64;4;8;8
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 18752 terms from /Users/donghwon/Desktop/develop/serving/tf_models/syntaxnet/work/models/word-map.
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 50 terms from /Users/donghwon/Desktop/develop/serving/tf_models/syntaxnet/work/models/tag-map.
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 49 terms from /Users/donghwon/Desktop/develop/serving/tf_models/syntaxnet/work/models/label-map.
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:35] Features: input.word input(1).word input(2).word input(3).word stack.word stack(1).word stack(2).word stack(3).word stack.child(1).word stack.child(1).sibling(-1).word stack.child(-1).word stack.child(-1).sibling(1).word stack(1).child(1).word stack(1).child(1).sibling(-1).word stack(1).child(-1).word stack(1).child(-1).sibling(1).word stack.child(2).word stack.child(-2).word stack(1).child(2).word stack(1).child(-2).word; input.tag input(1).tag input(2).tag input(3).tag stack.tag stack(1).tag stack(2).tag stack(3).tag stack.child(1).tag stack.child(1).sibling(-1).tag stack.child(-1).tag stack.child(-1).sibling(1).tag stack(1).child(1).tag stack(1).child(1).sibling(-1).tag stack(1).child(-1).tag stack(1).child(-1).sibling(1).tag stack.child(2).tag stack.child(-2).tag stack(1).child(2).tag stack(1).child(-2).tag; stack.child(1).label stack.child(1).sibling(-1).label stack.child(-1).label stack.child(-1).sibling(1).label stack(1).child(1).label stack(1).child(1).sibling(-1).label stack(1).child(-1).label stack(1).child(-1).sibling(1).label stack.child(2).label stack.child(-2).label stack(1).child(2).label stack(1).child(-2).label
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:36] Embedding names: words;tags;labels
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:37] Embedding dims: 64;32;32
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 18752 terms from /Users/donghwon/Desktop/develop/serving/tf_models/syntaxnet/work/models/word-map.
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 50 terms from /Users/donghwon/Desktop/develop/serving/tf_models/syntaxnet/work/models/tag-map.
INFO:tensorflow:Exporting trained model to exported
INFO:tensorflow:Write assest into: exported/00000001-tmp/assets using gfile_copy.
INFO:tensorflow:Copying asset trained_models/context to path exported/00000001-tmp/assets/context.
INFO:tensorflow:Copying asset trained_models/parser-params.meta to path exported/00000001-tmp/assets/parser-params.meta.
INFO:tensorflow:Copying asset trained_models/context.pbtxt.template to path exported/00000001-tmp/assets/context.pbtxt.template.
INFO:tensorflow:Copying asset trained_models/label-map to path exported/00000001-tmp/assets/label-map.
INFO:tensorflow:Copying asset trained_models/prefix-table to path exported/00000001-tmp/assets/prefix-table.
INFO:tensorflow:Copying asset trained_models/lcword-map to path exported/00000001-tmp/assets/lcword-map.
INFO:tensorflow:Copying asset trained_models/parser-params to path exported/00000001-tmp/assets/parser-params.
INFO:tensorflow:Copying asset trained_models/category-map to path exported/00000001-tmp/assets/category-map.
INFO:tensorflow:Copying asset trained_models/tag-to-category to path exported/00000001-tmp/assets/tag-to-category.
INFO:tensorflow:Copying asset trained_models/word-map to path exported/00000001-tmp/assets/word-map.
INFO:tensorflow:Copying asset trained_models/tag-map to path exported/00000001-tmp/assets/tag-map.
INFO:tensorflow:Copying asset trained_models/tagger-params to path exported/00000001-tmp/assets/tagger-params.
INFO:tensorflow:Copying asset trained_models/tagger-params.meta to path exported/00000001-tmp/assets/tagger-params.meta.
INFO:tensorflow:Copying asset trained_models/suffix-table to path exported/00000001-tmp/assets/suffix-table.
INFO:tensorflow:Copying asset trained_models/context.pbtxt to path exported/00000001-tmp/assets/context.pbtxt.
$ ./bazel-bin/tensorflow_serving/example/parsey_api --port=9000 exported/00000001
$ bazel-bin/tensorflow_serving/example/parsey_client --server=localhost:9000
hello syntaxnet
result {
  docid: "-:0"
  text: "hello syntaxnet"
  token {
    word: "hello"
    start: 0
    end: 4
    head: 1
    tag: "UH"
    category: "UH"
    label: "amod"
  }
  token {
    word: "syntaxnet"
    start: 5
    end: 13
    tag: "NN"
    category: "NN"
    label: "ROOT"
  }
}

Input :  hello syntaxnet
Parsing :
{"result": [{"text": "hello syntaxnet", "token": [{"category": "UH", "head": 1, "end": 4, "label": "amod", "start": 0, "tag": "UH", "word": "hello"}, {"category": "NN", "end": 13, "label": "ROOT", "start": 5, "tag": "NN", "word": "syntaxnet"}], "docid": "-:0"}]}
bazhura commented 7 years ago

Hello, I have trained the new model with xcopy_model(): I renamed it to copy_model() in train.sh (and removed the original copy_model() function).

I also built SyntaxNet as before, but cloned the most recent version of your repo. I still cannot export:

raise ValueError("Restore called with invalid save path %s" % save_path)
ValueError: Restore called with invalid save path ../models/tagger-params

Did you get the results with or without patching? My steps:

alina@machine:~/work$ cat models/context.pbtxt.template | sed "s=OUTPATH=/home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_tagger/greedy/64-0.08-3600-0.9-0=" > models/context.pbtxt

alina@machine:~/work$ bazel-bin/tensorflow_serving/example/parsey_mcparseface --model_dir=models --export_path=exported
bash: bazel-bin/tensorflow_serving/example/parsey_mcparseface: No such file or directory

alina@machine:~/work$ cd serving/

alina@machine:~/work/serving$ bazel-bin/tensorflow_serving/example/parsey_mcparseface --model_dir=../models --export_path=exported

I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 41 terms from /home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_tagger/greedy/64-0.08-3600-0.9-0/label-map.
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:35] Features: stack(3).word stack(2).word stack(1).word stack.word input.word input(1).word input(2).word input(3).word; input.digit input.hyphen; stack.suffix(length=2) input.suffix(length=2) input(1).suffix(length=2); stack.prefix(length=2) input.prefix(length=2) input(1).prefix(length=2)
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:36] Embedding names: words;other;suffix;prefix
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:37] Embedding dims: 64;4;8;8
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 34340 terms from /home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_tagger/greedy/64-0.08-3600-0.9-0/word-map.
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 17 terms from /home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_tagger/greedy/64-0.08-3600-0.9-0/tag-map.
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 41 terms from /home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_tagger/greedy/64-0.08-3600-0.9-0/label-map.
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:35] Features: input.word input(1).word input(2).word input(3).word stack.word stack(1).word stack(2).word stack(3).word stack.child(1).word stack.child(1).sibling(-1).word stack.child(-1).word stack.child(-1).sibling(1).word stack(1).child(1).word stack(1).child(1).sibling(-1).word stack(1).child(-1).word stack(1).child(-1).sibling(1).word stack.child(2).word stack.child(-2).word stack(1).child(2).word stack(1).child(-2).word; input.tag input(1).tag input(2).tag input(3).tag stack.tag stack(1).tag stack(2).tag stack(3).tag stack.child(1).tag stack.child(1).sibling(-1).tag stack.child(-1).tag stack.child(-1).sibling(1).tag stack(1).child(1).tag stack(1).child(1).sibling(-1).tag stack(1).child(-1).tag stack(1).child(-1).sibling(1).tag stack.child(2).tag stack.child(-2).tag stack(1).child(2).tag stack(1).child(-2).tag; stack.child(1).label stack.child(1).sibling(-1).label stack.child(-1).label stack.child(-1).sibling(1).label stack(1).child(1).label stack(1).child(1).sibling(-1).label stack(1).child(-1).label stack(1).child(-1).sibling(1).label stack.child(2).label stack.child(-2).label stack(1).child(2).label stack(1).child(-2).label
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:36] Embedding names: words;tags;labels
I external/syntaxnet/syntaxnet/embedding_feature_extractor.cc:37] Embedding dims: 64;32;32
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 34340 terms from /home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_tagger/greedy/64-0.08-3600-0.9-0/word-map.
I external/syntaxnet/syntaxnet/term_frequency_map.cc:101] Loaded 17 terms from /home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_tagger/greedy/64-0.08-3600-0.9-0/tag-map.
Traceback (most recent call last):
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/tensorflow_serving/example/parsey_mcparseface.py", line 188, in <module>
    tf.app.run()
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/external/org_tensorflow/tensorflow/python/platform/app.py", line 30, in run
    sys.exit(main(sys.argv))
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/tensorflow_serving/example/parsey_mcparseface.py", line 172, in main
    model[prefix]["documents"] = Build(sess, source, model[prefix])
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/tensorflow_serving/example/parsey_mcparseface.py", line 79, in Build
    parser.saver.restore(sess, model_path)
  File "/home/alina/work/serving/bazel-bin/tensorflow_serving/example/parsey_mcparseface.runfiles/tf_serving/external/org_tensorflow/tensorflow/python/training/saver.py", line 1127, in restore
    raise ValueError("Restore called with invalid save path %s" % save_path)
ValueError: Restore called with invalid save path ../models/tagger-params
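As far as I can tell, that ValueError is raised when no checkpoint file exists at the given save path, so a quick check, assuming the same layout as above, is whether the tagger checkpoint is really in ../models:

    $ ls -l ../models/tagger-params ../models/tagger-params.meta    # both should exist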

dsindex commented 7 years ago

I tested without the patch. It seems that OUTPATH should be the path to the 'models' directory.
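Concretely, that means assembling the trained resources into one flat directory before the sed step. A hand-written sketch (the source directory names follow the tree and paths shown above; pick whichever run holds your final parser-params):

    # gather the trained tagger and parser outputs into a single flat 'models' directory
    $ mkdir -p /home/alina/work/models
    $ cp /home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_tagger/greedy/64-0.08-3600-0.9-0/* /home/alina/work/models/
    $ cp /home/alina/work/UD_Croatian/tmp/syntaxnet-output/brain_parser/structured/512x512-0.02-100-0.9-0/* /home/alina/work/models/
    # then substitute OUTPATH with /home/alina/work/models and export as before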

bazhura commented 7 years ago

Would inserting the UD_Language folder into work/serving/tf_models/syntaxnet/syntaxnet/models be the right thing to do? The parsey_mcparseface folder is there. Thank you!

├── UD_Language
│   └── tmp
│       └── syntaxnet-output
│           ├── brain_parser
│           │   ├── greedy
│           │   │   └── 512x512-0.08-4400-0.85-4
│           │   └── structured
│           │       └── 512x512-0.02-100-0.9-0
│           └── brain_tagger
│               └── greedy
│                   └── 64-0.08-3600-0.9-0
│                       └── {... label-map in here ...}

dsindex commented 7 years ago

@bazhura

Since parsey_mcparseface tries to read resource files from the paths specified in models/context.pbtxt, OUTPATH should be like below:

$ pwd
/path/to
$ cat models/context.pbtxt.template | sed "s=OUTPATH=/path/to/models=" > models/context.pbtxt

I am not sure whether the error you got is related. :)

bazhura commented 7 years ago

@dsindex thank you for staying with me. I didn't succeed this time either :( I will close the issue.