openvinotoolkit / openvino

OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
https://docs.openvino.ai
Apache License 2.0

[Bug] Including <samples/ocv_common.hpp> fails in OpenVINO API 2.0 #16766

Closed. adammpolak closed this issue 1 year ago

adammpolak commented 1 year ago
System information (version)
Detailed description
/usr/share/openvino/samples/cpp/common/utils/include/samples/ocv_common.hpp:24:45: error: ‘InferenceEngine::Blob::Ptr’ has not been declared
   24 | void matU8ToBlob(const cv::Mat& orig_image, InferenceEngine::Blob::Ptr& blob, int batchIndex = 0) {
      |                                             ^~~~~~~~~~~~~~~
/usr/share/openvino/samples/cpp/common/utils/include/samples/ocv_common.hpp: In function ‘void matU8ToBlob(const cv::Mat&, int&, int)’:
/usr/share/openvino/samples/cpp/common/utils/include/samples/ocv_common.hpp:25:22: error: ‘SizeVector’ is not a member of ‘InferenceEngine’
   25 |     InferenceEngine::SizeVector blobSize = blob->getTensorDesc().getDims();
      |                      ^~~~~~~~~~
/usr/share/openvino/samples/cpp/common/utils/include/samples/ocv_common.hpp:26:26: error: ‘blobSize’ was not declared in this scope
   26 |     const size_t width = blobSize[3];
      |                          ^~~~~~~~
/usr/share/openvino/samples/cpp/common/utils/include/samples/ocv_common.hpp:29:22: error: ‘InferenceEngine::MemoryBlob’ has not been declared
   29 |     InferenceEngine::MemoryBlob::Ptr mblob = InferenceEngine::as<InferenceEngine::MemoryBlob>(blob);
Steps to reproduce
Issue submission checklist
Wovchena commented 1 year ago

That's strange. It must be something with your build configuration. You can verify that samples/ocv_common.hpp is valid by building unmodified samples.

You can try to simply remove these functions. If you use OpenVINO 2.0, you don't need them.
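For reference, here is a minimal API 2.0 sketch of what those helpers did, assuming a contiguous 8-bit BGR cv::Mat and a model input that accepts NHWC u8 data (the name wrap_mat_to_tensor is illustrative, not something OpenVINO ships):

    #include <opencv2/core.hpp>
    #include <openvino/openvino.hpp>

    // Illustrative helper (not part of OpenVINO): wrap a contiguous 8-bit BGR
    // cv::Mat as a {1, H, W, C} u8 ov::Tensor without copying the pixels.
    ov::Tensor wrap_mat_to_tensor(const cv::Mat& mat) {
        return ov::Tensor(ov::element::u8,
                          ov::Shape{1,
                                    static_cast<size_t>(mat.rows),
                                    static_cast<size_t>(mat.cols),
                                    static_cast<size_t>(mat.channels())},
                          mat.data);
    }

    // Usage: infer_request.set_input_tensor(wrap_mat_to_tensor(image));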

ilya-lavrenov commented 1 year ago

@Wovchena can we remove Blob-related code from the samples? The samples are written in API 2.0 now, so we can drop the legacy Blob helpers.

Wovchena commented 1 year ago

Ok. https://github.com/openvinotoolkit/openvino/pull/16787

adammpolak commented 1 year ago

Why don't we need them anymore? I appreciate OpenVINO doing its best to make the transition to 2.0 less painful, but I can't find the information I need.

The two docs I can find for the transition are https://docs.openvino.ai/latest/openvino_2_0_inference_pipeline.html#doxid-openvino-2-0-inference-pipeline and https://docs.openvino.ai/latest/openvino_2_0_preprocessing.html#doxid-openvino-2-0-preprocessing

But neither of these explains how to do the equivalent of network.getInputsInfo().size() or wrapMat2Blob(image) in the new API.

ilya-lavrenov commented 1 year ago

network.getInputsInfo().size() wrapMat2Blob(image)

That's not part of the API; it's helper code from the samples.

adammpolak commented 1 year ago

Where is the equivalent helper sample code in API v2.0?

I used to perform this action:

    network.getOutputsInfo().empty()
    features_output_info = network.getOutputsInfo()["features"];
    heatmaps_output_info = network.getOutputsInfo()["heatmaps"];
    pafs_output_info = network.getOutputsInfo()["pafs"];

    features_output_info->setPrecision(Precision::FP32);
    heatmaps_output_info->setPrecision(Precision::FP32);
    pafs_output_info->setPrecision(Precision::FP32);

I can't find how to do the equivalent anywhere in the docs.

I think a lot of people based their inference workflows off of hello_classifier.cpp from API v1.0.

Now with API v2.0 there is no 1:1 translation guide, so I am unclear whether I always have to use ov::Tensor input_tensor = ov::Tensor(input_type, input_shape, input_data.get()); from now on, or whether that is just one way to do it.
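For what it's worth, constructing your own ov::Tensor is only one of the options; you can also write into the tensor the infer request already owns. A minimal sketch, assuming an f32 input and placeholder model/device names:

    #include <algorithm>
    #include <openvino/openvino.hpp>

    int main() {
        ov::Core core;
        auto compiled = core.compile_model("model.xml", "CPU");  // placeholders
        ov::InferRequest request = compiled.create_infer_request();

        // Option A: wrap an existing buffer and hand it to the request.
        // ov::Tensor input_tensor(input_type, input_shape, input_data.get());
        // request.set_input_tensor(input_tensor);

        // Option B: fill the tensor the request already allocated.
        ov::Tensor input = request.get_input_tensor();
        float* dst = input.data<float>();         // assumes an f32 input
        std::fill_n(dst, input.get_size(), 0.f);  // stand-in for real data
        request.infer();
        return 0;
    }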

The demos are great, but the YouTube tutorial series is now out of date, so there is no step-by-step guide on how to prepare input/output data, sizes, and precisions for different layers. It doesn't exist.

The approach here: https://docs.openvino.ai/latest/openvino_2_0_inference_pipeline.html#doxid-openvino-2-0-inference-pipeline

Does not match the samples at all. How am I supposed to make heads or tails of this?

If there is an explainer anywhere of the 2.0 equivalents of input_info and getOutputsInfo, or of tensor vs. model, I can't find it.

ilya-lavrenov commented 1 year ago

I can't find how to do the equivalent anywhere in the docs.

https://docs.openvino.ai/latest/openvino_2_0_preprocessing.html#converting-precision-and-layout
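For example, a minimal sketch of what that page describes, applied to the snippet above (the output names "features", "heatmaps", and "pafs" come from your model; the model path is a placeholder):

    #include <openvino/core/preprocess/pre_post_process.hpp>
    #include <openvino/openvino.hpp>

    int main() {
        ov::Core core;
        auto model = core.read_model("model.xml");  // placeholder path

        // model->outputs().size() replaces network.getOutputsInfo().size();
        // the old setPrecision(Precision::FP32) becomes a preprocessing step.
        ov::preprocess::PrePostProcessor ppp(model);
        for (const char* name : {"features", "heatmaps", "pafs"}) {
            ppp.output(name).tensor().set_element_type(ov::element::f32);
        }
        model = ppp.build();

        ov::CompiledModel compiled = core.compile_model(model, "CPU");
        return 0;
    }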

Now with API v2.0 there is no 1:1 translation guide

It's here https://docs.openvino.ai/latest/openvino_2_0_transition_guide.html

tensor vs. model

OpenVINO does not introduce the terms model and tensor; they are common in the DL / AI domain.

adammpolak commented 1 year ago

@ilya-lavrenov I appreciate your efforts and thank you for responding