Dobiasd / frugally-deep

A lightweight header-only library for using Keras (TensorFlow) models in C++.
MIT License
1.07k stars 235 forks

Shape5 Documentation #162

Closed: Cblakslee closed this issue 5 years ago

Cblakslee commented 5 years ago

I am trying to use frugally-deep to convert a fully trained convolutional neural network from Python to C++. I got the example from the README to work and am now trying to apply the package to our CNN.

In the example, to my understanding, the fdeep::shape5 passed as the first parameter of the fdeep::tensor5 in model.predict() defines the dimensions of the input. The example Keras model has a 1-D input shape of (4,), which is why that shape5 is (1, 1, 1, 1, 4). For our model, since our input shape is (4096, 2, 1), I believe the shape5 would be (1, 1, 4096, 2, 1). As for the second parameter of the fdeep::tensor5, though, I am quite lost as to what it represents and how it should be determined.

I would like to ask if I am on the right track with my understanding of the example. Also, is there some place where I could find documentation on how to correctly use the frugally-deep package and all of its objects? Basically, I am trying to understand what I should provide as parameters to model.predict() for my CNN with an input shape of (4096, 2, 1) and how I should go about creating and formatting those parameters.

Dobiasd commented 5 years ago

Hi,

your understanding of fdeep::shape5 seems correct to me.

The second argument given to fdeep::tensor5 in the minimal example you are referring to is a vector of values used to fill the tensor of the given shape.

In the FAQ you can find more examples on how to work with tensor values. Especially the following might be helpful: https://github.com/Dobiasd/frugally-deep/blob/master/FAQ.md#how-to-fill-an-fdeeptensor5-with-values-eg-from-an-stdvectorfloat

Dobiasd commented 5 years ago

@Cblakslee Did this help you solve the problem?

Dobiasd commented 5 years ago

I'll close this issue for now. If the problem still persists, feel free to re-open and let me know where I can help. :slightly_smiling_face: