april-tools / cirkit

A Python framework to build, learn and reason about probabilistic circuits and tensor networks
https://cirkit-docs.readthedocs.io/en/latest/
GNU General Public License v3.0

Let channels be just variables everywhere #236

Open · loreloc opened 3 months ago

loreloc commented 3 months ago

Is there any reason not to merge the channel and variable dimensions in input layers and tensors?
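
For concreteness, here is a minimal NumPy sketch (not cirkit's actual API) of what merging the two dimensions would mean for input tensor shapes, assuming image data of shape (batch, channels, height, width):

```python
import numpy as np

# Hypothetical shapes for illustration only: a batch of RGB images of size 4x4.
batch_size, num_channels, height, width = 8, 3, 4, 4
x = np.random.rand(batch_size, num_channels, height, width)

# Separate-channel convention: inputs keep a channel axis,
# i.e. shape (batch, num_channels, num_variables) with one variable per pixel.
x_channels = x.reshape(batch_size, num_channels, height * width)

# Merged convention: each (pixel, channel) pair is its own variable,
# i.e. shape (batch, num_variables) with num_variables = C * H * W.
x_merged = x.reshape(batch_size, num_channels * height * width)

print(x_channels.shape)  # (8, 3, 16)
print(x_merged.shape)    # (8, 48)
```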

lkct commented 3 weeks ago

I guess it was originally because of structure in the data, e.g. images, where channels and width/height are treated differently: in QuadTree we only split the spatial dimensions, and making "variables" refer only to pixels keeps that simple. (Nevertheless, it complicates things in other places.)
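
A small hypothetical sketch of that point, assuming variables are flattened pixel indices: a QuadTree-style split only touches the spatial axes, and it stays expressible even if each pixel expands into num_channels consecutive variable indices (the flattening rule here is made up for illustration, not cirkit's actual scheme):

```python
import numpy as np

# Variables as pixel indices: a quadrant split is just a split over (row, col).
height, width, num_channels = 4, 4, 3
pixel_ids = np.arange(height * width).reshape(height, width)

# Top-left quadrant of a QuadTree split: a set of pixel (= variable) indices.
top_left_pixels = pixel_ids[: height // 2, : width // 2].ravel()

# If channels become variables too, the same split is still expressible:
# each pixel expands to num_channels consecutive variable indices.
def pixel_to_variables(pixel, num_channels=num_channels):
    # hypothetical flattening: variable id = pixel * num_channels + channel
    return [pixel * num_channels + c for c in range(num_channels)]

top_left_variables = [v for p in top_left_pixels for v in pixel_to_variables(p)]
print(top_left_pixels)     # [0 1 4 5]
print(top_left_variables)  # [0, 1, 2, 3, 4, 5, 12, 13, 14, 15, 16, 17]
```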

I'm also in favour of removing channels and using just variables, and I don't see a reason not to do so.

andreasgrv commented 2 weeks ago

+1. When trying to understand input layers I was confused about the role of num_channels, as I was unsure whether it is another way of overparametrising. Keeping some notes below in case they are useful in the future.

It seems like num_channels and num_inputs belong together conceptually, e.g. they could be grouped into a single "input_shape" variable. Or, even more generally, maybe num_inputs and num_outputs could be grouped into a shape variable whose last dimension is the number of outputs.
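
A hypothetical sketch of that grouping (the class and field names here are illustrative, not cirkit's API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InputLayerShape:
    # (num_channels, num_inputs, num_outputs): the last dimension is the number
    # of outputs, as suggested above; with channels merged into variables the
    # leading channel axis disappears.
    shape: tuple[int, ...]

    @property
    def num_outputs(self) -> int:
        return self.shape[-1]

# Separate-channel convention (assumed): (num_channels, num_inputs, num_outputs).
with_channels = InputLayerShape(shape=(3, 16, 8))
# After merging channels into variables: just (num_inputs, num_outputs).
merged = InputLayerShape(shape=(48, 8))
print(with_channels.num_outputs, merged.num_outputs)  # 8 8
```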

Caveat: I do not know how this is used deeper in the lib, and I assume changing it would probably break a lot of things; mentioning it here in case it is helpful when considering alternatives.