Closed: sachindesh closed this issue 6 years ago
Hi, you don't need to install Caffe yourself; it is automatically installed by DeepDetect when building it. You should not modify any headers or any other files.
I would recommend you remove your external Caffe build and start from a fresh DeepDetect clone. Follow the building instructions closely and it will work fine.
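The fresh-clone route described above can be sketched as follows (the repository URL is the jolibrain/deepdetect project; the exact cmake options are an assumption here and should be taken from the current README):

```shell
# Sketch of a clean rebuild, assuming the external Caffe lives in ~/caffe:
rm -rf ~/caffe                  # drop the external Caffe build entirely
git clone https://github.com/jolibrain/deepdetect.git
cd deepdetect
mkdir build && cd build
cmake ..                        # add BLAS/CPU-only options per the README
make
```

DeepDetect fetches and builds its own patched Caffe during this step, which is why a pre-existing external Caffe is unnecessary and can conflict.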
### Configuration
#### Your question / the problem you're facing

Running the demo of the DeepDetect sentiment analysis pre-trained model (English) described at: https://www.deepdetect.com/applications/text_model/#list-of-character-based-text-classification-models:54b61be348fc21f37c7f0cc7489f0c78
1) Installed Caffe, but with errors:
   - "The dependency target "pycaffe" of target "pytest" does not exist."
   - "Target "caffe" INTERFACE_INCLUDE_DIRECTORIES property contains path: "/root/caffe/src/caffe/" which is prefixed in the source directory."
2) Installed DeepDetect with BLAS=open and CPU_ONLY mode ON (faced a lot of header-file issues and fixed them by setting the proper header file paths).
3) Started the dede server on HTTP port 8080.
4) Sent a request to create a service using curl.
5) Received the following errors on the server and the client:
Error seen on the client:

```
{"status":{"code":500,"msg":"InternalError","dd_code":1007,"dd_msg":"src/caffe/util/upgrade_proto.cpp:88 / Check failed (custom): ReadProtoFromTextFile(param_file, param)"}}
```

Error seen on the server:

```
[libprotobuf ERROR google/protobuf/text_format.cc:274] Error parsing text-format caffe.NetParameter: 1:7: Message type "caffe.NetParameter" has no field named "b0VIM".
[2018-04-19 18:41:26.170] [sent_en] [error] Error creating network
[2018-04-19 18:41:26.171] [sent_en] [error] service creation call failed
[2018-04-19 18:41:26.171] [api] [error] 127.0.0.1 "PUT /services/sent_en" 500 0
```
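One hint in the server error: "b0VIM" is the magic signature at the start of a Vim swap file, so the prototxt that DeepDetect is trying to parse appears to be (or to have been overwritten by) a Vim swap file rather than a text-format `caffe.NetParameter`. A minimal check (file name is illustrative, not from the report):

```python
# Check whether a supposed prototxt file is actually a Vim swap file.
# Vim swap files begin with the bytes b"b0VIM" -- exactly the token that
# protobuf reported as an unknown field.

def looks_like_vim_swapfile(path):
    with open(path, "rb") as f:
        return f.read(5) == b"b0VIM"

# Simulated corrupted file standing in for the model's deploy prototxt:
with open("deploy.prototxt", "wb") as f:
    f.write(b"b0VIM 7.4 ...")

print(looks_like_vim_swapfile("deploy.prototxt"))  # True -> re-download the model files
```

If the check is positive, re-downloading the model files (rather than patching them) is the likely fix, which is consistent with the advice above to not modify any files.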
Request used:

```
curl -X PUT 'http://localhost:8080/services/sent_en' -d '{
  "mllib": "caffe",
  "description": "English sentiment classification",
  "type": "supervised",
  "parameters": {
    "input": {
      "connector": "txt",
      "characters": true,
      "alphabet": "abcdefghijklmnopqrstuvwxyz0123456789,;.!?'\''",
      "sequence": 140
    },
    "mllib": {
      "nclasses": 2
    }
  },
  "model": {
    "repository": "/home/me/models/sent_en_char"
  }
}'
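For reference, the same service-creation payload can be built and sanity-checked in Python before sending; this only mirrors the fields of the curl call above and does not contact a server:

```python
import json

# Service-creation payload, field-for-field the same as the curl request above.
payload = {
    "mllib": "caffe",
    "description": "English sentiment classification",
    "type": "supervised",
    "parameters": {
        "input": {
            "connector": "txt",
            "characters": True,
            "alphabet": "abcdefghijklmnopqrstuvwxyz0123456789,;.!?'",
            "sequence": 140,
        },
        "mllib": {"nclasses": 2},
    },
    "model": {"repository": "/home/me/models/sent_en_char"},
}

body = json.dumps(payload)  # body to PUT to /services/sent_en
print(json.loads(body)["parameters"]["input"]["sequence"])  # 140
```

A round-trip through `json.dumps`/`json.loads` catches quoting mistakes (for example around the apostrophe in the alphabet) before they turn into opaque server-side errors.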
Error seen in the Caffe build:

```
-- Caffe Configuration Summary
-- General:
--   Version           : 1.0.0
--   Git               : 1.0-108-g8645207-dirty
--   System            : Linux
--   C++ compiler      : /usr/bin/c++
--   Release CXX flags : -O3 -DNDEBUG -fPIC -Wall -Wno-sign-compare -Wno-uninitialized
--   Debug CXX flags   : -g -fPIC -Wall -Wno-sign-compare -Wno-uninitialized
--   Build type        : Release
--
--   BUILD_SHARED_LIBS : ON
--   BUILD_python      : ON
--   BUILD_matlab      : OFF
--   BUILD_docs        : ON
--   CPU_ONLY          : ON
--   USE_OPENCV        : ON
--   USE_LEVELDB       : ON
--   USE_LMDB          : ON
--   USE_NCCL          : OFF
--   ALLOW_LMDB_NOLOCK : OFF
--
-- Dependencies:
--   BLAS              : Yes (open)
--   Boost             : Yes (ver. 1.58)
--   glog              : Yes
--   gflags            : Yes
--   protobuf          : Yes (ver. 2.6.1)
--   lmdb              : Yes (ver. 0.9.17)
--   LevelDB           : Yes (ver. 1.18)
--   Snappy            : Yes (ver. 1.1.3)
--   OpenCV            : Yes (ver. 2.4.9.1)
--   CUDA              : No
--
-- Documentaion:
--   Doxygen           : No
--   config_file       :
--
-- Install:
--   Install path      : /root/caffe/build/install
--
-- Configuring done
CMake Warning (dev) in src/caffe/CMakeLists.txt:
  Policy CMP0022 is not set: INTERFACE_LINK_LIBRARIES defines the link
  interface. Run "cmake --help-policy CMP0022" for policy details. Use the
  cmake_policy command to set the policy and suppress this warning.

  Target "caffe" has an INTERFACE_LINK_LIBRARIES property which differs from
  its LINK_INTERFACE_LIBRARIES properties.

  INTERFACE_LINK_LIBRARIES:

  LINK_INTERFACE_LIBRARIES:

This warning is for project developers. Use -Wno-dev to suppress it.

CMake Error at CMakeLists.txt:104 (add_dependencies):
  The dependency target "pycaffe" of target "pytest" does not exist.

CMake Error in src/caffe/CMakeLists.txt:
  Target "caffe" INTERFACE_INCLUDE_DIRECTORIES property contains path:

    "/root/caffe/src/caffe/"

  which is prefixed in the source directory.

-- Generating done
-- Build files have been written to: /root/caffe/build
```
I would appreciate help solving these errors: I want to run the sentiment analysis model and then train it further with more data, for learning purposes.