microsoft / onnxruntime

ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
https://onnxruntime.ai
MIT License
14.37k stars 2.88k forks

There is a problem: GetTensorMutableData #6516

Open williamlzw opened 3 years ago

williamlzw commented 3 years ago

One method creates the tensor, and another method calls GetTensorMutableData, but it does not return the tensor's values. I created the tensor with data {1.23, 2.33}, but GetTensorMutableData returns {4.1408e+09, 9.43074e-43}. I developed the code with Volcano IDE and then called Visual Studio to compile it: http://www.voldp.com/voldev.html https://23f49e.link.yunpan.360.cn/lk/surl_yStR9BxUfCB#/-0


Code url: https://github.com/laizewei/errordemo The main files: project.rar -> vpkg_onnx.cpp, vcls_rg_Onnxzhll.h, vpkg_main.cpp build.rar -> Debug/onnx.exe

fs-eire commented 3 years ago

The following code in your example does not work:

    std::vector<float> floatArray(count);
    status=g_ort->GetTensorMutableData(m_tensor,(void**)floatArray.data());

This is because the API GetTensorMutableData does not copy data into a caller-supplied buffer. Instead, it sets the value of out to the address of the tensor's underlying buffer, through which the data can then be read or written.

  /* from onnxruntime_c_api.h */

  // This function doesn't work with string tensor
  // this is a no-copy method whose pointer is only valid until the backing OrtValue is free'd.
  ORT_API2_STATUS(GetTensorMutableData, _Inout_ OrtValue* value, _Outptr_ void** out);

The correct way to call this API is:

    float *data;
    status=g_ort->GetTensorMutableData(m_tensor, &data);

You can then access the tensor elements through data:

    //std::cout<<floatArray[index]<<","<<count<<std::endl;
    std::cout<<data[index]<<","<<count<<std::endl;