distillpub / post--differentiable-parameterizations

A powerful, under-explored tool for neural network visualizations and art.
https://distill.pub/2018/differentiable-parameterizations
Creative Commons Attribution 4.0 International

Sharpening xy2rgb section #27

Closed — colah closed this issue 6 years ago

colah commented 6 years ago

Initial revision of the first half:

[image: screenshot of the initial revision]

colah commented 6 years ago

I'd also suggest shortening the section name from "Compositional Pattern Producing Networks as Differentiable Parameterization" to "Compositional Pattern Producing Networks." It feels unnecessarily long: the whole article is about parameterizations.

Nicola17 commented 6 years ago

+1 on the title

Here are my comments:

colah commented 6 years ago

Incorporated your feedback.

The parameters ... determine -> It seems that only the weights and the biases play a role, while the architecture is equally important.

I think the present version works? When people talk about normal networks, they often talk about the parameters determining the network's behavior. It's implicit that this is for a given architecture, but it makes sense to focus on the parameters because we're talking about learning.

I'd rather add a new paragraph later on about the importance of the architecture. Ideally with examples of the same feature visualized using different architectures. :)
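For context on the parameters-vs-architecture point: the xy2rgb parameterization under discussion can be sketched as a small fully connected network that maps each pixel coordinate to a color. This is a minimal illustrative sketch, not the article's implementation — the function name, sizes, and random (untrained) weights are all assumptions. It shows how, for a fixed architecture (depth, width, activation), the image is fully determined by the weights and biases, while changing the architecture changes the character of the image:

```python
import numpy as np

def cppn_image(size=64, hidden=24, layers=4, seed=0):
    """Render an image from a randomly initialized CPPN (illustrative sketch).

    A CPPN maps each pixel's (x, y) coordinate to an RGB value through a
    small fully connected network, so the image is parameterized by the
    network's weights and biases, given a fixed architecture.
    """
    rng = np.random.default_rng(seed)

    # Pixel coordinate grid, normalized to [-1, 1].
    coords = np.linspace(-1.0, 1.0, size)
    x, y = np.meshgrid(coords, coords)
    h = np.stack([x.ravel(), y.ravel()], axis=1)  # shape: (size*size, 2)

    # Hidden layers: the architectural choices here (number of layers,
    # width, tanh activation) strongly shape the look of the output.
    for _ in range(layers):
        w = rng.normal(0.0, 1.0, (h.shape[1], hidden))
        b = rng.normal(0.0, 1.0, hidden)
        h = np.tanh(h @ w + b)

    # Output layer: 3 channels squashed to [0, 1] with a sigmoid.
    w = rng.normal(0.0, 1.0, (hidden, 3))
    rgb = 1.0 / (1.0 + np.exp(-(h @ w)))
    return rgb.reshape(size, size, 3)

img = cppn_image()
```

Because every operation is differentiable, the weights (for this fixed architecture) could be optimized against a feature-visualization objective; rerunning with a different `layers` or activation would give the same objective a visibly different character.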

distinctive clean images -> I'm not sure that this is the main characteristic. I find the distortions and the out-of-focus effects even more interesting ^^

I just dropped "clean", but I would like to say something specific about the images. To me the main characteristic is that they generally consist of smooth gradients and sharp edges, with very little "texture". Some of them are out of focus, but many aren't. (This is consistent with the "computational complexity" interpretation.)