Relative representations can be leveraged to solve "latent communication" tasks: from zero-shot model stitching to latent-space comparison across diverse settings.
Hi, thanks for this wonderful work.
I have a question about the mechanism of stitching models with relative representations. When performing model stitching, once the similarity vectors are computed, are they L2-normalized before being passed on, or are they connected to the downstream module as-is?
Thanks
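For context, my understanding of the construction in question is roughly the following sketch (in NumPy; the function name `relative_representation` and the shapes are my own, not from the repo): each absolute embedding is mapped to its cosine similarities with a shared set of anchor embeddings, and since cosine similarity L2-normalizes both operands, the result is invariant to rotations and rescalings of the latent space.

```python
import numpy as np

def relative_representation(z, anchors):
    """Map absolute embeddings to cosine similarities with anchors.

    z:        (n, d) absolute embeddings from some encoder
    anchors:  (k, d) embeddings of the shared anchor samples
    returns:  (n, k) relative representation
    """
    # Cosine similarity implicitly L2-normalizes both operands,
    # so the output is invariant to rescaling of z.
    z_n = z / np.linalg.norm(z, axis=1, keepdims=True)
    a_n = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    return z_n @ a_n.T

# Two encoders whose latent spaces differ by a rotation produce the
# same relative representation (up to numerical error).
rng = np.random.default_rng(0)
z = rng.normal(size=(5, 8))
anchors = rng.normal(size=(3, 8))
q, _ = np.linalg.qr(rng.normal(size=(8, 8)))  # random orthogonal matrix
r1 = relative_representation(z, anchors)
r2 = relative_representation(z @ q, anchors @ q)
print(np.allclose(r1, r2))  # prints True: rotation-invariant
```

If this matches the intended mechanism, my question reduces to whether any further normalization is applied to the (n, k) similarity vectors before they enter the stitched decoder.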