dwSun closed this issue 6 years ago
@martinwicke can you comment or redirect? Thanks!
It is possible. Many ops already support higher-rank tensors. If you run into ops that don't, and the extension is straightforward, it is often enough to add an op registration for a higher rank (see, for example, the registrations for tf.reverse up to rank 8: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/core/kernels/reverse_op.cc#L190).
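For illustration, here is a minimal sketch (TF 1.x style; the shape is made up) of an op that already accepts higher-rank input — tf.reverse on a rank-5 tensor, which is covered by the registrations linked above:

```python
import numpy as np
import tensorflow as tf

# Rank-5 tensor: BatchSize x Capsule x H x W x Channels (illustrative shape).
x = tf.constant(np.arange(720, dtype=np.float32).reshape(2, 3, 4, 5, 6))

# tf.reverse has kernel registrations up to rank 8, so rank-5 input works directly.
y = tf.reverse(x, axis=[1])

with tf.Session() as sess:
    print(sess.run(y).shape)  # (2, 3, 4, 5, 6)
```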
The main drawback of registering more kernels is longer build times and larger binaries. I encourage you to work on a 5-D flow and add op registrations and ops as you need them. We'd be happy to accept such changes.
I will close this issue, but if you find specific ops that need additional kernels, or if you identify other issues, please open a new, more specific issue.
Presently, we use a 4-D data flow to build most of our models: BatchSize x W x H x Channels or BatchSize x Channels x H x W.
Is it possible to build a 5-D data flow, perhaps BatchSize x Capsule x W x H x Channels? That would be useful for ideas like CapsNet, since we would not have to reshape our data (a sketch of the reshape workaround appears at the end of this question). We could then build higher-dimensional models that are more complicated and should be better in both structure and feature-extraction performance.
According to MobileNetV2, ReLU performs better in high-dimensional spaces than in low-dimensional ones, so it might work even better if we could build the entire model out of high-dimensional operations. Furthermore, what about 6-D or 7-D operations, or even arbitrary N-D operations?
Are these kinds of operations possible? If so, how hard would they be to implement?
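For reference, here is a hedged sketch (TF 1.x style; all shapes and variable names are illustrative) of the reshape workaround mentioned above: the Capsule axis is folded into the batch axis so that existing 4-D kernels such as tf.nn.conv2d apply, and the 5-D layout is restored afterwards.

```python
import tensorflow as tf

B, C, H, W, CH = 8, 4, 32, 32, 16  # BatchSize x Capsule x H x W x Channels
x = tf.placeholder(tf.float32, [B, C, H, W, CH])

# Fold the capsule axis into the batch axis to reuse existing 4-D (NHWC) kernels.
x4 = tf.reshape(x, [B * C, H, W, CH])
kernel = tf.get_variable("kernel", [3, 3, CH, 32])
y4 = tf.nn.conv2d(x4, kernel, strides=[1, 1, 1, 1], padding="SAME")

# Restore the 5-D layout afterwards.
y = tf.reshape(y4, [B, C, H, W, 32])
```

Natively 5-D ops such as tf.nn.conv3d (which takes [batch, depth, height, width, channels] input) avoid this round trip in some cases, but a general N-D data flow would still need the extra kernel registrations discussed above.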