Open zhifeng-huang opened 1 year ago
You can use these embeddings as the initial weights for your user embedding layer. For example:

```python
embedding_layer = tf.keras.layers.Embedding(
    input_dim=vocab_size,            # number of users in the vocabulary
    output_dim=embedding_dim,        # dimensionality of the pretrained vectors
    weights=[pretrained_embeddings], # numpy array of shape (vocab_size, embedding_dim)
    trainable=True,                  # allow fine-tuning during training
)
```
Is this compatible with the examples in the tutorials that use StringLookup? Can this be combined with pretrained embedding extractors (like BERT) dynamically, or only if all embeddings are extracted in advance? And can we combine it with the tutorial examples, e.g. the retrieval task? Thanks!
@ddofer This is a different approach vs. the string lookup in the tutorial.
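The two approaches can also be combined: a StringLookup maps raw user IDs to indices, and an Embedding layer initialized from the pretrained vectors sits behind it. A minimal sketch, assuming a small illustrative vocabulary and a hypothetical `pretrained` dict of per-user vectors (both names are assumptions, not from the thread); the key detail is that the weight matrix must follow the lookup's own vocabulary order, including its OOV slot at index 0:

```python
import numpy as np
import tensorflow as tf

# Hypothetical setup: user IDs and a dict of pretrained vectors per user.
user_vocab = ["u1", "u2", "u3"]
embedding_dim = 4
pretrained = {u: np.random.rand(embedding_dim).astype("float32") for u in user_vocab}

lookup = tf.keras.layers.StringLookup(vocabulary=user_vocab, mask_token=None)

# Build the weight matrix in the lookup's vocabulary order;
# index 0 is the OOV bucket, which gets a random vector here.
vocab = lookup.get_vocabulary()  # ["[UNK]", "u1", "u2", "u3"]
matrix = np.stack([
    pretrained.get(t, np.random.rand(embedding_dim).astype("float32"))
    for t in vocab
])

user_model = tf.keras.Sequential([
    lookup,
    tf.keras.layers.Embedding(
        input_dim=len(vocab),
        output_dim=embedding_dim,
        # Equivalent to passing weights=[matrix] at construction time.
        embeddings_initializer=tf.keras.initializers.Constant(matrix),
        trainable=True,
    ),
])

out = user_model(tf.constant(["u2"]))
print(out.shape)  # (1, 4)
```

This only works when all embeddings are extracted in advance; to run an extractor like BERT dynamically you would need to call it inside the model rather than seed an Embedding table.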
Great, thanks! I've added that to my retrieval model using strong, large (384-dim) pretrained embeddings. Oddly, the results are the same, despite this being a sparse problem for users and items.
This question can't be answered until you share more details; the outcome could be related to multiple factors.
Hello,
I have a signature embedding of size 256 for each user. How can I use this vector as a feature in the user model?
Any tips are appreciated.
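One common pattern is to store the fixed signature vectors in a frozen Embedding table keyed by user ID, then concatenate them with a learned ID embedding inside the user model. A minimal sketch, assuming the vectors arrive as a numpy array aligned with the user ID list (the names `signatures`, `sig_dim`, and the tower sizes are illustrative assumptions):

```python
import numpy as np
import tensorflow as tf

# Hypothetical data: per-user 256-dim signature vectors, keyed by user ID.
user_ids = ["u1", "u2", "u3"]
sig_dim = 256
signatures = np.random.rand(len(user_ids), sig_dim).astype("float32")

lookup = tf.keras.layers.StringLookup(vocabulary=user_ids, mask_token=None)

# Signature table: row 0 (the OOV slot) is zeros; frozen so the vectors stay fixed.
sig_matrix = np.vstack([np.zeros((1, sig_dim), dtype="float32"), signatures])
signature_layer = tf.keras.layers.Embedding(
    input_dim=len(user_ids) + 1,
    output_dim=sig_dim,
    embeddings_initializer=tf.keras.initializers.Constant(sig_matrix),
    trainable=False,
)

# Learned ID embedding, trained alongside the rest of the model.
id_embedding = tf.keras.layers.Embedding(input_dim=len(user_ids) + 1, output_dim=32)

inputs = tf.keras.Input(shape=(), dtype=tf.string)
idx = lookup(inputs)
features = tf.keras.layers.Concatenate()([id_embedding(idx), signature_layer(idx)])
user_vector = tf.keras.layers.Dense(64)(features)  # project to the query-tower size
user_model = tf.keras.Model(inputs, user_vector)

print(user_model(tf.constant(["u2"])).shape)  # (1, 64)
```

Setting `trainable=True` on `signature_layer` instead would let the signatures be fine-tuned rather than held fixed.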