dpfried / incoder

Generative model for code infilling and synthesis

Use InCoder for semantic code search #2

Closed · Dumbris closed this 2 years ago

Dumbris commented 2 years ago

Did you try to use the InCoder model to encode source code into dense vectors (embeddings) for semantic code search?

If it's possible, which layer's outputs would be better to use for that?
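For concreteness, here is a minimal sketch of what I have in mind, using the Hugging Face transformers API (the checkpoint, the choice of last layer, and mean pooling are just guesses on my part, nothing validated):

```python
# Untested sketch: mean-pool InCoder hidden states into a fixed-size
# code embedding. Layer and pooling choices here are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/incoder-1B")
model = AutoModelForCausalLM.from_pretrained("facebook/incoder-1B")
model.eval()

def embed(code: str) -> torch.Tensor:
    inputs = tokenizer(code, return_tensors="pt", truncation=True)
    with torch.no_grad():
        out = model(**inputs, output_hidden_states=True)
    # hidden_states[-1]: (1, seq_len, hidden_size); mask-aware mean pooling
    hidden = out.hidden_states[-1]
    mask = inputs["attention_mask"].unsqueeze(-1).float()
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

query = embed("def binary_search(arr, target):")
candidate = embed("def bsearch(xs, x):")
print(torch.nn.functional.cosine_similarity(query, candidate))
```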

Thank you!

dpfried commented 2 years ago

We haven't tried this, but it'd be interesting to see results.

Without having tried it, though, I might expect that, all else being equal (training data, etc.), it would be better to use a model with bidirectional attention over the input (rather than InCoder's causal attention), e.g. CodeBERT, and possibly one also pre-trained with a denoising objective, e.g. CodeT5 or PLBART.

You might also look at what models have done well on the code search subset of CodeXGLUE (if you haven't already).

My intuition is that causal-attention models like ours are better for zero-shot generation, but models with bidirectional attention are better for producing representations. I think there's some exploration related to this in Wang et al. 2022.
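For comparison, a bidirectional encoder like CodeBERT sees the whole input at every position; a minimal, untested sketch (the [CLS]-token pooling here is an assumption, and a checkpoint fine-tuned for retrieval would likely work better than the base model):

```python
# Untested sketch of the bidirectional alternative: encode code with
# CodeBERT and take the [CLS] vector as the sequence representation.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModel.from_pretrained("microsoft/codebert-base")
model.eval()

def embed(code: str) -> torch.Tensor:
    inputs = tokenizer(code, return_tensors="pt", truncation=True)
    with torch.no_grad():
        out = model(**inputs)
    # First position is the [CLS]/<s> token
    return out.last_hidden_state[:, 0]

print(embed("def add(a, b): return a + b").shape)  # torch.Size([1, 768])
```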

Dumbris commented 2 years ago

Thank you so much for the references; they're a good starting point for this task.