shenweichen / DeepCTR

Easy-to-use, modular and extendible package of deep-learning based CTR models.
https://deepctr-doc.readthedocs.io/en/latest/index.html
Apache License 2.0

Why is the embedding size parameter not present in the case of FiBiNET? #164

Closed rajeshshrestha closed 1 year ago

rajeshshrestha commented 4 years ago

I don't see an option to change the embedding size in FiBiNET.

shenweichen commented 4 years ago

@Rajesh45npt Hi, if your deepctr version is 0.7.0+ (run `pip list` to check your version), the embedding size is set by specifying `embedding_dim` in `SparseFeat` or `VarLenSparseFeat`. https://github.com/shenweichen/DeepCTR/blob/db229dc31f0d4c79c0de2ece0bb919b35258d6b2/examples/run_classification_criteo.py#L28
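For reference, here is a minimal sketch of how this looks for FiBiNET under the 0.7.x feature-column API. The feature names, vocabulary sizes, and the `embedding_dim=8` value below are made up for illustration, and the import path is an assumption (`deepctr.inputs` in 0.7.x; it later moved to `deepctr.feature_column`):

```python
# Minimal sketch, assuming deepctr 0.7.x. Feature names and sizes are illustrative only.
from deepctr.inputs import SparseFeat, DenseFeat  # deepctr.feature_column in later releases
from deepctr.models import FiBiNET

sparse_features = ['C1', 'C2', 'C3']   # hypothetical categorical fields
dense_features = ['I1']                # hypothetical numeric field

# embedding_dim on each SparseFeat controls the embedding size the model uses
feature_columns = (
    [SparseFeat(name, vocabulary_size=1000, embedding_dim=8) for name in sparse_features]
    + [DenseFeat(name, 1) for name in dense_features]
)

model = FiBiNET(linear_feature_columns=feature_columns,
                dnn_feature_columns=feature_columns,
                task='binary')
model.compile('adam', 'binary_crossentropy')
```

Under this API the model itself no longer takes an embedding size argument; the size travels with each feature column instead.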

rajeshshrestha commented 4 years ago

Thanks, @shenweichen, but I think this functionality was only added in version 0.7.0. Is there an option to set the embedding size for FiBiNET in v0.6.2? Also, could you clarify the difference between the embedding size argument present in most of the models, like NFFM, and the embedding size passed to SparseFeat?