facebookresearch / dlrm

An implementation of a deep learning recommendation model (DLRM)
MIT License

Why can embedding vectors have various dimensions? #382

Closed YoungsukKim12 closed 2 months ago

YoungsukKim12 commented 3 months ago

Hello, I am deeply interested in recommendation systems, especially DLRM. As far as I know, embedding vector dimensions vary from model to model, with per-vector sizes ranging from 64 B to 512 B (e.g., 16 to 128 fp32 elements) depending on the model configuration. Is there a reason for this wide range of dimensionality? My guess is that it provides flexibility in the model configuration: embedding tables are growing in entry count these days for better model quality, so increasing the vector dimension beyond 512 B could introduce extra memory and bandwidth overhead. Is this the main reason for the variety of vector dimensions?
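
For reference, here is a minimal sketch of the scaling I have in mind (assuming fp32 weights and a made-up table size; in this repo the dimension appears to be set via the `--arch-sparse-feature-size` flag of `dlrm_s_pytorch.py`, if I understand correctly):

```python
import torch.nn as nn

# Hypothetical sweep over embedding dimensions; with fp32 weights
# (4 bytes/element), dim=16 gives 64 B per vector and dim=128 gives
# 512 B per vector, matching the range mentioned above.
num_embeddings = 100_000  # assumed table size, for illustration only

for dim in (16, 32, 64, 128):
    table = nn.EmbeddingBag(num_embeddings, dim, mode="sum")
    bytes_per_vector = dim * table.weight.element_size()
    table_bytes = table.weight.numel() * table.weight.element_size()
    print(f"dim={dim:4d}: {bytes_per_vector:4d} B/vector, "
          f"table = {table_bytes / 2**20:6.1f} MiB")
```

As the sweep shows, total table memory grows linearly with the dimension, so doubling it doubles both storage and the bytes moved per lookup.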