Closed jmd-0 closed 5 years ago
Hi @gowthamkpr , any advice? Thank you for your help!
Hi, I tested in 2.0-rc0 (with Eager disabled) instead of 1.14, but intuitively the results should be similar.
In TF2, one issue would come from using `input_mat.shape[0]`, which will return None; you should probably use `tf.shape(input_mat)[0]` instead, which will return a dynamic scalar tensor pointing to the inputs' actual batch size. This might not be the case in 1.14, and at any rate, to answer your question: in general, it _is_ possible to handle a dynamic batch size, typically with a `tf.while_loop` (in your case, AutoGraph is generating one based on your code).
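To make the static-vs-dynamic distinction concrete, here is a minimal sketch (the function name and shapes are illustrative, not from the original code) of summing over a dynamic batch size with `tf.while_loop`, using `tf.shape` to read the runtime batch size:

```python
import tensorflow as tf

# Sketch: iterate over a dynamic batch dimension with tf.while_loop.
# Inside this traced function, input_mat.shape[0] is None, while
# tf.shape(input_mat)[0] is a runtime scalar usable as a loop bound.
@tf.function(input_signature=[tf.TensorSpec((None, 3), tf.float32)])
def sum_rows(input_mat):
    n = tf.shape(input_mat)[0]   # dynamic batch size (int32 scalar tensor)
    i = tf.constant(0)
    acc = tf.zeros((3,))

    def cond(i, acc):
        return i < n

    def body(i, acc):
        return i + 1, acc + input_mat[i]

    _, acc = tf.while_loop(cond, body, [i, acc])
    return acc
```

(In practice a vectorized reduction like `tf.reduce_sum` would replace such a loop; the loop form just shows that the dynamic bound works.)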
That being said, I think in your case the core failure reason is that you are attempting to feed a (symbolic) tensor to a scikit-learn object designed to work on numpy arrays. This might work with Eager enabled on EagerTensors (which are implicitly converted back and forth to numpy arrays), but not (I think) on symbolic ones.
In my humble opinion, you should probably implement (or find) a TensorFlow version of the KMeans algorithm and use it (or, possibly, find an alternative clustering layer that is easier to write and might have some learnable weights). At any rate, I hope that this helps a bit, and that you will find a suitable solution soon!
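For what a "TensorFlow version of KMeans" could look like, here is a minimal sketch of one Lloyd's-style update step written purely in TF ops (this is an illustration I am adding, not an official API): it runs on tensors, so it can live inside a graph with a dynamic batch size.

```python
import tensorflow as tf

# Sketch: one K-Means step in plain TensorFlow ops.
# points: (n, d) float tensor, centroids: (k, d) float tensor.
def kmeans_step(points, centroids):
    # Squared distances between every point and every centroid: (n, k).
    d2 = tf.reduce_sum(
        tf.square(points[:, None, :] - centroids[None, :, :]), axis=-1)
    # Index of the nearest centroid for each point: (n,).
    assign = tf.argmin(d2, axis=1)
    # Recompute each centroid as the mean of its assigned points.
    k = tf.shape(centroids)[0]
    new_centroids = tf.math.unsorted_segment_mean(points, assign, k)
    return assign, new_centroids
```

Iterating this step until the assignments stop changing gives the full algorithm; wrapping it in `tf.while_loop` keeps everything differentiable-graph-friendly.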
@jmd-0 Did @pandrey-fr's answer solve your problem? Can I close the issue?
Hi @pandrey-fr, thank you for your response! I tried to experiment with the different TF functions you suggested and was unable to do exactly what I wanted, so it seems that I should just try to use the TF implementation of the KMeans algorithm, as you suggested.
Also, thank you for the interesting tidbit about how TF generates the symbolic tensors for unknown dataset sizes. I need to be aware of that in the future.
Thanks again, @pandrey-fr
You are very welcome; good luck with the follow-up work :-)
I'm facing the same problem. Has anyone solved this kind of problem?
Any updates on this? I need a custom layer that does a for loop.
@Andreasksalk A Layer does not support iteration; you have to perform the operation on the entire batch.
So it is a complete no-go to use packages that are not Keras? I have to perform optimal transport on a set of tensors in a Keras model. I have found a way to do this on a single input, but cannot seem to find a way to integrate it into the Keras model.
@Andreasksalk It depends on what you want to perform, and more specifically when you want it to happen. The overall issue is very simple: in order to train your model through backpropagation, TensorFlow needs to be able to track what happens to the data, so that it can compute the gradients of the loss function with respect to the model's trainable weights. But if you want to operate on your input data (before any trainable weights are used), you should be able to do it, probably using a Lambda layer and setting it to run eagerly.
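As a minimal sketch of that Lambda-layer pattern (the normalization transform here is just a placeholder for whatever non-trainable preprocessing you need):

```python
import tensorflow as tf

# Sketch: a Lambda layer applies a non-trainable transform to the inputs
# before any trainable weights are involved.
inputs = tf.keras.Input(shape=(4,))
normalized = tf.keras.layers.Lambda(
    lambda x: x / (tf.norm(x, axis=-1, keepdims=True) + 1e-8))(inputs)
outputs = tf.keras.layers.Dense(2)(normalized)
model = tf.keras.Model(inputs, outputs)

# If the transform must call into non-TF code (e.g. numpy-based libraries),
# compiling with run_eagerly=True forces eager execution during training:
# model.compile(optimizer="adam", loss="mse", run_eagerly=True)
```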
I am trying to implement a custom Keras layer which does a random shear. I am trying to use `tf.keras.preprocessing.image.random_shear`, which implements random shear per image, so I have to iterate over the tensor and call this method for each input. However, the input is of shape (None, 32, 32, 3), so I can't know the number of rows. I tried to use `tf.shape(inputs)[0]`, but it did not help.
Is there any other way to do this?
Hello @ashwanikumar04,
I think you are looking for `tf.map_fn` (note that `random_shear` also takes a required `intensity` argument, so you would wrap it in a lambda: `sheared = tf.map_fn(lambda img: tf.keras.preprocessing.image.random_shear(img, intensity), inputs)`).
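Here is a minimal, runnable sketch of the `tf.map_fn` pattern, with a simple left-right flip standing in for `random_shear` (the per-image idea is the same; I substituted the transform only so the example has no extra dependencies):

```python
import tensorflow as tf

# Sketch: tf.map_fn applies a per-image function across the leading
# (batch) axis, even when that axis is None at graph-build time.
def flip_batch(inputs):
    return tf.map_fn(lambda img: tf.image.flip_left_right(img), inputs)

# Tiny batch of two 2x2 single-channel "images" with values 0..7.
images = tf.reshape(tf.range(8, dtype=tf.float32), (2, 2, 2, 1))
flipped = flip_batch(images)
```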
System information

Describe the current behavior
I am attempting to build a custom TensorFlow Layer to perform K-Means clustering across the channels of a given image. I am having difficulty creating this new layer to add to the model: it seems that, fundamentally, I don't have the ability to iterate over the batch size, which is unknown until runtime. I have tried a few alternatives, such as the `@tf.function` decorator and the `tf.scan` function, both of which have been unsuccessful.

Describe the expected behavior
I was expecting that, since the batch size is unknown until runtime, TensorFlow would be able to handle this case, similar to how TensorFlow can accept an unknown dimension and generate a matrix/tensor with the unknown shape.
Code to reproduce the issue
The error that I get from running the above code is as follows:
My main question is: can TensorFlow not handle iterating over an unknown batch size, or am I missing some functionality?
Thank you for the help!