mohammadbashiri / fitgabor

A simple tool to find the Gabor filter that maximizes the output of your neuron model

Fitting receptive field of a real neuron #9

Open mariakesa opened 9 months ago

mariakesa commented 9 months ago

Hey!

Thanks so much for your work!

I am working on the Allen Brain Observatory dataset, and I was wondering: can I use your package to fit Gabors to the receptive fields of real neurons? I.e., say I use linear regression to fit weights from images to neural activity and use those weights as a receptive field -- can I then fit a Gabor to this receptive field with your package?
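As a concrete sketch of that first step, here is a minimal closed-form ridge regression in plain NumPy on synthetic data (all shapes, the regularization strength, and the noise level are made-up placeholders, not anything from the Allen dataset):

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical shapes: 500 stimulus frames of 8x8 pixels, one neuron
n_frames, h, w = 500, 8, 8
X = rng.standard_normal((n_frames, h * w))             # flattened images
true_rf = rng.standard_normal(h * w)                   # unknown receptive field
y = X @ true_rf + 0.1 * rng.standard_normal(n_frames)  # simulated neural activity

# closed-form ridge regression: w = (X^T X + lam * I)^-1 X^T y
lam = 1.0
w_hat = np.linalg.solve(X.T @ X + lam * np.eye(h * w), X.T @ y)

# the fitted weights, viewed as a receptive-field image
rf_image = w_hat.reshape(h, w)
```

The resulting `rf_image` is the kind of weight map one could then hand to a Gabor-fitting objective.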

Thanks again!

mohammadbashiri commented 9 months ago

Hi @mariakesa,

Thank you for your interest in using this repo!

Yes indeed you can - it all depends on the objective function. But before I suggest anything, let me make sure I understand the steps and the goal here:

1. You fit a linear model from images to neural activity and treat the learned weights as the neuron's receptive field.
2. You then want to find the Gabor filter that best matches those weights.

Is that correct?

If that is the case, then you can simply define the objective as the MSE between the generated Gabor and the weights from your model, and that should do it. This would require a new trainer_fn:

import numpy as np
import torch.optim as optim
from tqdm import trange


def trainer_fn(gabor_generator, model_weights,
               epochs=20000, lr=5e-3,
               fixed_std=.01,
               save_rf_every_n_epoch=None,
               optimizer=optim.Adam):
    gabor_generator.apply_changes()

    optimizer = optimizer(gabor_generator.parameters(), lr=lr)

    lr_scheduler = optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode='max', factor=0.1, patience=10)
    old_lr = lr
    lr_change_counter = 0

    pbar = trange(epochs, desc="Loss: {}".format(np.nan), leave=True)
    saved_rfs = []
    for epoch in pbar:

        current_lr = optimizer.state_dict()['param_groups'][0]['lr']
        if old_lr != current_lr:
            old_lr = current_lr
            lr_change_counter += 1

        if lr_change_counter > 3:
            break

        def closure():
            optimizer.zero_grad()

            # generate gabor
            gabor = gabor_generator()

            if fixed_std is not None:
                gabor_std = gabor.std()
                gabor_std_constrained = fixed_std * gabor / gabor_std

            else:
                gabor_std_constrained = gabor

            loss = ((gabor_std_constrained - model_weights)**2).mean()

            loss.backward()

            return loss

        loss = optimizer.step(closure)

        pbar.set_description("Loss: {:.2f}".format(loss.item()))

        if save_rf_every_n_epoch is not None:
            gabor = gabor_generator()
            if (epoch % save_rf_every_n_epoch) == 0:
                saved_rfs.append(gabor.squeeze().cpu().data.numpy())

        lr_scheduler.step(-loss)

    gabor_generator.eval()
    return gabor_generator, saved_rfs

Note that this is very similar to the existing trainer_fn, but the loss has been changed to the MSE between the generated Gabor and the model weights.
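To make that objective concrete outside the training loop, here is a self-contained toy sketch of the MSE loss with the same fixed-std constraint as in the trainer above. The `gabor` helper and its parameter values are purely illustrative assumptions, not the repo's GaborGenerator:

```python
import numpy as np

def gabor(theta, sigma, freq, size=16):
    """Toy parametric Gabor on a size x size grid (illustrative only)."""
    ax = np.linspace(-1, 1, size)
    xx, yy = np.meshgrid(ax, ax)
    xr = xx * np.cos(theta) + yy * np.sin(theta)       # rotated coordinate
    envelope = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * xr)

fixed_std = 0.01

# stand-in for the model weights: a Gabor rescaled to the fixed std
model_weights = gabor(theta=0.5, sigma=0.4, freq=2.0)
model_weights = fixed_std * model_weights / model_weights.std()

# a candidate Gabor with a slightly different orientation, same constraint
candidate = gabor(theta=0.6, sigma=0.4, freq=2.0)
candidate = fixed_std * candidate / candidate.std()

# the objective minimized by the trainer: pixelwise MSE
mse = ((candidate - model_weights) ** 2).mean()
```

Because both tensors are rescaled to the same std, the loss is driven by the Gabor's shape parameters rather than its overall amplitude.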

I hope this helps. I am also happy to discuss this further.

Also, if you'd like, you are more than welcome to contribute to this repo and add a trainer function (and a demo) to optimize a gabor that matches a given "image" (in your case that would be weights but one can look at them as an image still).

mariakesa commented 9 months ago

Hey, thanks for your quick reply^^

Indeed, you got me! That's exactly what I want to do.

I would be happy to contribute to the repo and work with you! I'm going to start by fitting some receptive fields on the Allen data (using ridge regression and, if needed, PCA on image sequences) and eyeballing them to check whether I see any Gabors. If I can't get decent receptive field fits for the Allen data, I can work on the Stringer-Pachitariu V1 dataset on Figshare, where I know I can get difference-of-Gaussians and Gabor receptive fields, because I've worked on it before: https://github.com/MouseLand/EnsemblePursuit/blob/master/sand_poster2019.pdf Anyway, I am interested in working on this -- I need this functionality for my analyses :-)

Thanks again and best, Maria

mohammadbashiri commented 9 months ago

Sounds like a plan! And thanks for sharing the poster - that's very cool! I would like to know more about it, but do I understand it (on a shallow level) correctly that you find neurons that exhibit correlated activity, and then look at their receptive fields? Is there a paper about this? I wonder if these RFs say anything about the image statistics, i.e. RFs of co-active neurons might reflect components in the image space that often appear together and are relevant for the subject. I am quite familiar with Carsen's and Marius's work and am always inspired by it!

I am also in general into clustering neurons based on their functional properties - my very first abstract in my PhD was actually about this (AREADNE Abstract).

Would be happy to chat further - let me know if I can be of any help.

Good luck and best wishes, Mohammad

mariakesa commented 9 months ago

Hey!

That's an interesting poster! So the idea is to do a non-linear transformation of the neurons before clustering? I would like to know more! That sounds really cool -- it makes perfect sense to look at a different space for clustering the neurons than the original space, which is what we did in our clustering algorithm :-D But perhaps the ideas can be combined?

EnsemblePursuit, which the poster is about, is available to try here (easy pip install and sklearn-style API): https://github.com/MouseLand/EnsemblePursuit We didn't reach the publication stage. The idea of the algorithm is matrix factorization (a bit similar to Marius' spike sorting work), with a regularization term that thresholds the correlation so that each cluster either contains a neuron or doesn't (L0 regularization of the PCA objective). You keep adding neurons to the cluster, recalculate their average as a prototype of the cluster, and then subtract u@v.T for each cluster (u is an all-or-nothing membership vector over neurons, v the average timeseries of the neurons) from the data matrix. Each cluster has a timeseries, and you can fit a linear receptive field to it, which is what we did. Hope that explains it a bit -- it's really not a difficult algorithm. Let me know if anything is unclear.
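A rough NumPy sketch of one such extraction step, following the description above (the function name, the correlation threshold, the fixed number of alternating updates, and the toy data are all illustrative assumptions, not the actual EnsemblePursuit implementation):

```python
import numpy as np

def ensemble_pursuit_step(data, seed, corr_thresh=0.3):
    """Extract one ensemble: binary membership u, prototype v, residual data.

    data: (n_neurons, n_timepoints) activity matrix.
    """
    v = data[seed].copy()                  # initialize prototype with the seed neuron
    u = np.zeros(data.shape[0])
    u[seed] = 1.0
    for _ in range(10):                    # alternate membership / prototype updates
        # correlate every neuron with the current prototype
        corr = np.array([np.corrcoef(row, v)[0, 1] for row in data])
        u = (corr > corr_thresh).astype(float)  # all-or-nothing membership
        u[seed] = 1.0                           # keep the seed in the ensemble
        v = data[u.astype(bool)].mean(axis=0)   # prototype = average timeseries
    residual = data - np.outer(u, v)       # subtract the rank-one component u@v.T
    return u, v, residual

# toy data: neurons 0-4 share a common signal, neurons 5-9 are independent noise
rng = np.random.default_rng(0)
t = 200
signal = rng.standard_normal(t)
data = 0.1 * rng.standard_normal((10, t))
data[:5] += signal

u, v, residual = ensemble_pursuit_step(data, seed=0)
```

The prototype `v` is the kind of cluster timeseries one could then regress against images to get a linear receptive field, as described above.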

I'm actually currently working on predicting neural activity from Vision Transformer embeddings of stimuli (i.e., p(neural activity | stimulus) replaced with p(neural activity | f(stimulus))). I'm doing a self-study master's with MIT OCW courses and books. I began in September. Matthew Kaufman kindly agreed to supervise me: https://biologicalsciences.uchicago.edu/faculty/matthew-kaufman-phd The plan is to write a web-page thesis (with Javascript interactivity) at the end of two years. I have Fridays off from work, so I usually work on neuroscience Friday-Sunday. Let's keep in touch, and I'll start fitting receptive fields on Friday :-)

Best, Maria

mohammadbashiri commented 9 months ago

Hey, sorry for the late reply.

The idea of the abstract is this: many neurons seem to apply a similar function to the image, but at different locations and orientations (and potentially scales, etc.). Of course we can learn a completely different function for each neuron, but we can also turn the thing around: let's transform the input image for each neuron such that the same function applies. The abstract basically suggests one model that can do that. Unfortunately it's not computationally efficient, though, because you need to copy the same image once per neuron.
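As a toy illustration of that idea: a single shared filter is applied after a per-neuron transform of the image. Here an integer shift stands in for the learned transform, and every name, shape, and shift value is a made-up placeholder:

```python
import numpy as np

rng = np.random.default_rng(1)

# one shared function (a 5x5 linear filter) used by every neuron
shared_filter = rng.standard_normal((5, 5))
image = rng.standard_normal((20, 20))

# hypothetical per-neuron (dy, dx) shifts standing in for learned transforms
neuron_shifts = [(0, 0), (3, -2), (-5, 4)]

responses = []
for dy, dx in neuron_shifts:
    # the image is copied and transformed once per neuron -- the memory cost
    # mentioned above
    shifted = np.roll(np.roll(image, dy, axis=0), dx, axis=1)
    responses.append((shifted[:5, :5] * shared_filter).sum())
```

Each neuron's response comes from the same `shared_filter`; only the per-neuron transform of the input differs.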

Nice, that sounds similar to what I did during my PhD (using convolutional neural nets to predict responses). Sounds very exciting, and I would be very happy to chat at some point (we can also talk about EnsemblePursuit).

Good luck with the master's! And do let me know any time you would like to chat or bounce some ideas off of each other :)

Best, Mo