microsoft / CNTK

Microsoft Cognitive Toolkit (CNTK), an open source deep-learning toolkit
https://docs.microsoft.com/cognitive-toolkit/

C Wrapper for batch input #3070

Open AnupViAn opened 6 years ago

AnupViAn commented 6 years ago

Hello,

My trained model takes a 24x32 grayscale image as input and has 34 output classes. I am using this model from my C code via the C wrapper functions (CNTK_LoadModel / CNTK_EvaluateSequence) provided in CNTKLibraryC.cpp. When I pass a single image to CNTK_EvaluateSequence, I get the output. Now I want to pass my input as a batch of 10 images. Do I need to change the function, or is this possible with the current CNTK_EvaluateSequence function?

CNTK_StatusCode CNTK_EvaluateSequence(CNTK_ModelHandle model,
                                      const CNTK_Variable* inputs,      /* from CNTK_GetModelArgumentsInfo */
                                      const CNTK_Value* inputValues,    /* image pixel values as float */
                                      const bool* inputResetFlags,      /* false */
                                      uint32_t numInputs,               /* 1 */
                                      const CNTK_Variable* outputs,     /* from CNTK_GetModelOutputsInfo */
                                      uint32_t numOutputs,              /* 1 */
                                      CNTK_Value** outputValues)

Any help would be really appreciated. Thanks

ke1337 commented 6 years ago

Just change your input value to shape 32 x 24 x batchSize, and the output value will be 34 x batchSize. Note that the CNTK C/C# interface assumes data is column-major.
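To illustrate the layout: with a column-major shape of 32 x 24 x batchSize, the width dimension varies fastest, so an image already stored row by row (width 32) can simply be concatenated image after image into one flat buffer. The sketch below shows only that packing step; pack_batch is a hypothetical helper name, and attaching the buffer and shape to a CNTK_Value is not shown, since those details come from CNTKLibraryC.h.

/* Hedged sketch: pack a batch of 24x32 grayscale images into one flat
 * float buffer for CNTK_EvaluateSequence. Assumes each image is stored
 * row by row (width 32 fastest), which matches CNTK's column-major
 * layout for a [32 x 24 x batchSize] value. pack_batch is a
 * hypothetical helper, not part of the CNTK C API. */
#include <stdlib.h>
#include <string.h>
#include <stdint.h>

#define IMG_W 32
#define IMG_H 24
#define IMG_PIXELS (IMG_W * IMG_H)

float* pack_batch(const float* const* images, uint32_t batchSize)
{
    float* buffer = (float*)malloc((size_t)batchSize * IMG_PIXELS * sizeof(float));
    if (!buffer)
        return NULL;
    for (uint32_t b = 0; b < batchSize; ++b)
    {
        /* each image contributes IMG_PIXELS consecutive floats */
        memcpy(buffer + (size_t)b * IMG_PIXELS, images[b], IMG_PIXELS * sizeof(float));
    }
    return buffer;
}

The corresponding output buffer would then hold 34 * batchSize floats, with the 34 class scores for image b starting at offset b * 34.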

AnupViAn commented 6 years ago

@KeDengMS The batch input problem is solved for me, but now I am facing an issue with heap memory. I am calling the CNTK_EvaluateSequence function inside a "for" loop, and the batch size varies (increasing or decreasing, as my application requires) on each iteration. I observed (in the VS2017 performance profiler) that the heap allocation increases on each iteration. If I keep the batch size constant, the heap allocation stays constant.

[screenshot: heap profiling in VS2017]

Can you please suggest where I am going wrong? Thanks
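If the growth comes from CNTK reallocating internal buffers whenever the input shape changes between calls (an assumption, not something confirmed in this thread), one thing to try is keeping the value shape fixed by padding every batch up to a maximum size and ignoring the padded columns in the output. A minimal sketch of that idea, with hypothetical helper and constant names:

/* Hedged sketch of a possible mitigation: pad each batch to a fixed
 * MAX_BATCH so the input shape (32 x 24 x MAX_BATCH) never changes
 * across iterations. Only the first actualBatch * 34 output values
 * would then be meaningful. This illustrates the idea; it is not a
 * fix or API documented by CNTK. */
#include <stdlib.h>
#include <string.h>
#include <stdint.h>

#define IMG_W 32
#define IMG_H 24
#define IMG_PIXELS (IMG_W * IMG_H)
#define MAX_BATCH 64   /* hypothetical upper bound on the batch size */

float* pack_padded_batch(const float* const* images, uint32_t actualBatch)
{
    /* zero-fill so unused slots hold blank images */
    float* buffer = (float*)calloc((size_t)MAX_BATCH * IMG_PIXELS, sizeof(float));
    if (!buffer)
        return NULL;
    for (uint32_t b = 0; b < actualBatch && b < MAX_BATCH; ++b)
        memcpy(buffer + (size_t)b * IMG_PIXELS, images[b], IMG_PIXELS * sizeof(float));
    return buffer;
}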

ke1337 commented 6 years ago

Please take a look at this answer.

AnupViAn commented 6 years ago

I tried as per your suggestion, but I am still facing the same problem.

ke1337 commented 6 years ago

Which version of CNTK are you using? There's a bug fix for an MKL memory leak in 2.5. If you are running 2.5, please share a repro.

AnupViAn commented 6 years ago

I upgraded from version 2.4 to 2.5 and now it's working fine. Thanks for your time and support.

ahmadmahmoody commented 6 years ago

Hello, I see a memory leak with Java evaluation for both 2.4 and 2.5. Are the native DLLs for CNTK.jar also updated? Thanks