greentfrapp / lucent

Lucid library adapted for PyTorch
Apache License 2.0

Utils to show module names with their repr(); add linearly weighted activations as objective; add pretrained GAN as parametrization #5

Closed: Animadversio closed this 4 years ago

Animadversio commented 4 years ago

Dear author,

Thanks so much for implementing Lucid in PyTorch! I really enjoyed using it in my projects, where I leverage deep neural networks to understand real neurons in visual cortices. In my use case, I want to activate multiple channels together to match the selectivity of a biological neuron or of units in another network. This can be done by summing the existing channel or neuron objectives, but that becomes very inefficient during backpropagation.

So here are my two cents. In this commit I add a utility to show module names with their repr(), a linearly weighted activations objective, and a pretrained GAN as a parametrization.
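
To make the weighted objective concrete, here is a minimal sketch of a linearly weighted channel objective in Lucent's style. The name linweight and its exact signature are illustrative assumptions, not necessarily what the commit uses:

    import torch
    from lucent.optvis.objectives import wrap_objective

    @wrap_objective()
    def linweight(layer, weights):
        """Maximize a weighted sum of channel activations in one pass.

        `weights` holds one scalar per channel, so a whole selectivity
        profile costs a single backward pass instead of one per channel.
        """
        def inner(model):
            acts = model(layer)  # conv activations: (batch, channels, H, W)
            w = weights.to(acts.device).reshape(1, -1, 1, 1)
            return -(acts * w).mean()  # negative because render_vis minimizes
        return inner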

greentfrapp commented 4 years ago

@Animadversio thanks for the contribution! And I'm glad Lucent helped with your work.

The linearly weighted activations parts look good. Can you add some tests in tests/optvis/test_objectives.py for the new objective functions? You can run the tests with

$ coverage run --source . --omit setup.py -m pytest
$ coverage report -m
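
For what it's worth, a test for the new objective could look something like this sketch, which checks the objective against a hand-computed weighted mean on a fake activation tensor (it assumes the hypothetical linweight objective sketched above has been added to lucent.optvis.objectives):

    import torch
    from lucent.optvis import objectives

    def test_linweight_matches_weighted_mean():
        acts = torch.rand(1, 4, 8, 8)
        model = lambda layer: acts  # stand-in for the hooked model
        weights = torch.tensor([1.0, 0.0, 2.0, 0.0])
        loss = objectives.linweight("any_layer", weights)(model)
        expected = -(acts * weights.reshape(1, -1, 1, 1)).mean()
        assert torch.isclose(loss, expected)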

Regarding the new lucent_layernames function in util.py: it actually already exists as get_model_layers in lucent/modelzoo/util.py (https://github.com/greentfrapp/lucent/blob/044317a7b395220e6a27fd890c35abc081c5d1c8/lucent/modelzoo/util.py#L21), although I just realized that I didn't add the following snippet!

if layer is None:
    # e.g. GoogLeNet's aux1 and aux2 layers
    continue

Can you make this change in the modelzoo/util.py file instead? Thanks!!!
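
For context, a minimal sketch of why the None guard matters, assuming a torchvision version where constructing a pretrained GoogLeNet with aux_logits=False leaves its aux1 and aux2 entries set to None:

    from torchvision.models import googlenet
    from lucent.modelzoo.util import get_model_layers

    # aux1 and aux2 are None here, so get_model_layers must skip them
    model = googlenet(pretrained=True, aux_logits=False)
    print(get_model_layers(model))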

Animadversio commented 4 years ago

Sure! Thanks for telling me how to run the tests! I hadn't tried that before.

Animadversio commented 4 years ago

@greentfrapp Hi! I added tests for the new objectives. Besides that, I added a new kind of parametrization, using a pretrained GAN as a prior to visualize features, and added tests for it as well.

The method is inspired by Nguyen, A., Dosovitskiy, A., Yosinski, J., Brox, T., & Clune, J. (2016). Synthesizing the preferred inputs for neurons in neural networks via deep generator networks. NIPS. I have translated the generator into PyTorch and host the weights of the pretrained GAN in my personal space.
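
As a rough sketch of the idea: the image is parametrized as G(z) and only the latent code z receives gradients, while the generator stays frozen. The names below (gan_param_f, the fc6 generator G, the VGG16 layer "classifier_6" for unit 629) are illustrative assumptions, not necessarily the names in the commit:

    import torch
    from lucent.optvis import render, objectives

    def gan_param_f(generator, code_len=4096, device="cpu"):
        # optimize the latent code; the generator's weights stay frozen
        code = torch.randn(1, code_len, device=device, requires_grad=True)
        def image_f():
            return generator(code)  # maps the latent code to an image tensor
        return [code], image_f

    # hypothetical usage with a translated fc6 DeePSiM generator `G`:
    # obj = objectives.channel("classifier_6", 629)  # "lipstick" unit in VGG16
    # imgs = render.render_vis(vgg16, obj, param_f=lambda: gan_param_f(G))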

The GANs were originally shared at https://lmb.informatik.uni-freiburg.de/people/dosovits/code.html. The results are quite impressive for higher-level concept neurons. Below are samples using the fc8, fc7, and fc6 GANs as parametrizations to visualize the lipstick neuron in VGG16.

[Images: vgg16-fc-ch0629 visualized with fc8, fc7, and fc6 GAN parametrizations]

greentfrapp commented 4 years ago

Thanks @Animadversio! The GAN-as-prior work looks interesting, and intuitively it should work better for higher-level layers than for lower-level ones, given how the GAN is trained. I will merge this for now, but I think it would be helpful if you could also add a notebook demonstrating the use of the GAN prior!