Open · daspk04 opened 8 months ago
Hi @Pratyush1991 ,
It looks like it, yes... thanks for the information, I'll have to dig into that. If somebody would be kind enough to provide a minimal working example, that would be great.
Thanks @remicres! I might be able to do it, at least based on the current OTBTF version: something like `otbtf.dataset` patches. Actually `tfrecords` should work as well, as TorchData supports it, c.f.

Hi @daspk04 , I've pushed release candidate images of OTBTF including TF 2.18 and OTB 9.1.
We still need to update some Python scripts to Keras 3, but if you already want to try something with OTB apps you may use one of the following images:
I've made a lot of changes to the build process, and I hope I didn't break anything in the TF install, so any feedback would be much appreciated!
Cheers
Hi @vidlb !
Thanks for the upgrade. I gave it a try with the OTBTF tutorial.
There were a few specific issues:
- `tf.*`-related APIs had to be migrated to `keras.ops`
- `out_tensor._keras_history.layer` -> `out_tensor._keras_history.operation`
- A shape error: `target` and `output` must have the same shape. Received: `target.shape=(8, 1, 1, 6)`, `output.shape=(8, 1, 1, 64)`. I'm not sure exactly why this occurs (also due to my lack of experience with Keras), but based on the fcnn example, removing the additional output worked.

Working model example:
```python
class SimpleCNNModel(otbtf.ModelBase):
    """ This is a subclass of `otbtf.ModelBase` to implement a CNN """

    def normalize_inputs(self, inputs):
        """ This function normalizes the input, scaling values by 0.0001 """
        return {inp_key: keras.ops.cast(inputs[inp_key], "float32") * 0.0001}

    def get_outputs(self, normalized_inputs):
        """ This function implements the model """
        inp = normalized_inputs[inp_key]
        net = conv(inp, 16, 5, "conv1")  # 12x12x16
        net = pool(net)                  # 6x6x16
        net = conv(net, 32, 3, "conv2")  # 4x4x32
        net = pool(net)                  # 2x2x32
        net = conv(net, 64, 2, "feats")  # 1x1x64
        net = conv(net, class_nb, 1, "classifier", None)
        softmax_op = keras.layers.Softmax(name="softmax_layer")
        estim = softmax_op(net)
        return {tgt_key: estim}
```
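As a side note, the spatial sizes in the comments above follow from valid-padding convolution arithmetic (output = input - kernel + 1) and 2x2 pooling, assuming a 16x16 input patch (my assumption, based on the comments; not stated in the tutorial snippet). A quick plain-Python sanity check:

```python
def conv_out(size, kernel):
    # "valid" 2D convolution shrinks each spatial dim by (kernel - 1)
    return size - kernel + 1

def pool_out(size):
    # 2x2 max pooling with stride 2 halves each spatial dim
    return size // 2

size = 16                  # assumed input patch size (16x16)
size = conv_out(size, 5)   # conv1 -> 12x12
size = pool_out(size)      # -> 6x6
size = conv_out(size, 3)   # conv2 -> 4x4
size = pool_out(size)      # -> 2x2
size = conv_out(size, 2)   # feats -> 1x1
print(size)                # -> 1
```

This matches the per-layer sizes annotated in the model, so the final `classifier` and `softmax_layer` operate on a single 1x1 pixel per patch.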
Looks like keras 3 will give us a bit of extra work :)
A bit yes but it shouldn't be too hard.
Initially I started to update the code, then thought my MR was already way too big, so I saved this in a patch file.
If you want to take a look: as pointed out by Pratyush, this is mostly stuff related to `keras.ops`, `_keras_history`, and some function arguments (e.g. `keras.ops.one_hot`):
I believe the most annoying change is that Keras now refuses to take a dict of named outputs; the name should be set / inferred in the layer properties, but sometimes it seems to get lost due to optimizations.
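To illustrate the point about output naming (a minimal standalone sketch, not OTBTF code, with made-up layer names): in a Keras 3 functional model the output name is inferred from the layer that produces the tensor, so the final layer has to carry the name you previously put in the output dict:

```python
import keras

inp = keras.Input(shape=(8,), name="in")
# Name the layer itself: Keras 3 infers the model's output name from it,
# rather than from a dict key in the return value.
out = keras.layers.Dense(2, name="estimation")(inp)
model = keras.Model(inp, out)
print(model.output_names)
```

If an intermediate op is inserted after the named layer (or the graph is optimized), that inferred name can change, which seems consistent with the "lost due to optimizations" behaviour described above.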
Hi @remicres !
Last week TensorFlow released the 2.16.0-rc version. One interesting point is that Keras 3 will be the default version. Keras 3 seems quite interesting: it supports multiple frameworks (TensorFlow, PyTorch, JAX).
So I'm assuming that we would then be able to run any model written in PyTorch directly with OTBTF as well?
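For what it's worth, Keras 3 selects its backend through the `KERAS_BACKEND` environment variable, which must be set before `keras` is imported. A minimal sketch of the mechanism (whether OTBTF's data plumbing works outside the TensorFlow backend is a separate question):

```python
import os

# Keras 3 reads KERAS_BACKEND once, at import time.
# Valid values: "tensorflow" (the default), "jax", "torch".
os.environ["KERAS_BACKEND"] = "torch"

# import keras            # must happen AFTER the variable is set
# keras.backend.backend() # would then report "torch"
print(os.environ["KERAS_BACKEND"])
```

Setting the variable after `import keras` has no effect, since the backend is fixed at import time.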