NVIDIA-Merlin / dataloader

The merlin dataloader lets you rapidly load tabular data for training deep learning models with TensorFlow, PyTorch, or JAX.
Apache License 2.0

Feed pre-trained embeddings to NVTabular #124

Status: Open · MelissaKR opened this issue 2 years ago

MelissaKR commented 2 years ago

What is your question? I have a dataset that includes a column of pre-trained embeddings. I couldn't find any documentation or examples on how this column should be passed to NVTabular. Is it treated as a continuous feature?

rnyak commented 2 years ago

@MelissaKR thanks for the question. Can you tell us more about your use case?

Basically, if you want to do some feature pre-processing on the column of pre-trained embeddings, yes, you can feed them as continuous features to NVTabular.
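For example, a rough sketch of what that could look like (column names are illustrative, and `AddTags` availability may depend on your NVTabular version):

```python
import nvtabular as nvt

# "movie_embedding" holds the pre-trained vectors; tag it as continuous so
# downstream loaders and models treat it like any other numeric input.
conts = ["movie_embedding"] >> nvt.ops.AddTags(["continuous"])
cats = ["user_id"] >> nvt.ops.Categorify()

workflow = nvt.Workflow(conts + cats + ["label"])
dataset = nvt.Dataset("train.parquet")

# Fit the workflow statistics and write the transformed data back out.
workflow.fit_transform(dataset).to_parquet("processed/")
```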

Let us know if you have further questions.

MelissaKR commented 2 years ago

@rnyak Thank you for your response. I have another model that outputs embeddings for a given set of features, and I want to replace those features in the original model with the embeddings I have obtained. Should I simply pass these new feature columns as conts in TorchAsyncItr? It would be great to see example code of how pre-trained embeddings are passed to NVTabular's TorchAsyncItr.
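To illustrate, this is roughly what I had in mind (a sketch with illustrative column names, assuming NVTabular's torch loader and that a list-typed embedding column can go straight into conts):

```python
import nvtabular as nvt
from nvtabular.loader.torch import TorchAsyncItr

dataset = nvt.Dataset("processed/*.parquet")

train_itr = TorchAsyncItr(
    dataset,
    batch_size=1024,
    cats=["user_id"],            # remaining categorical features
    conts=["movie_embedding"],   # pre-trained vectors passed as a continuous feature
    labels=["label"],
)

for batch in train_itr:
    ...  # each batch carries the cats, conts, and labels tensors
```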

viswa-nvidia commented 1 year ago

@rnyak , to follow up on this.

rnyak commented 1 year ago

@MelissaKR this issue has been open for a while. Do you mind giving a bit more detail about what you want to do with the embeddings you are getting from the other model, and what your original model is? We currently support feeding embeddings into an embedding layer; you can see that in the TensorFlow example. Let us know if that's what you were looking for, or something else. Thanks.
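In outline, that path looks roughly like this (a sketch only; the operator name and signature here are assumptions and may differ by release, so check the example for the exact API):

```python
import numpy as np
from merlin.io import Dataset
from merlin.dataloader.torch import Loader
from merlin.dataloader.ops.embeddings import EmbeddingOperator  # name assumed

dataset = Dataset("train.parquet")

# Pre-trained table: row i holds the vector for movie id i.
pretrained = np.load("movie_embeddings.npy")

loader = Loader(
    dataset,
    batch_size=1024,
    transforms=[
        EmbeddingOperator(
            pretrained,
            lookup_key="movie_id",             # id column in the data
            embedding_name="movie_embedding",  # name of the injected feature
        )
    ],
)
```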

MelissaKR commented 1 year ago

@rnyak Thank you for getting back to me on this! I have a main model (in PyTorch), and in that model, let's say I have a feature for different movies. I could pass it as a regular categorical feature to be fed to an embedding layer. But I have trained a separate collaborative-filtering model that learns much better embeddings for these movies. So for each movie in the main model's training and validation sets, I now have a vector of size n holding the learned embedding.

That means I no longer need to pass the movie feature to an embedding layer. Instead, I want to remove it from my dataset and use the learned embeddings from the second model. I'd like to know if there is a straightforward way of doing this, rather than manually defining n new numeric features, one per element of the movie embedding, and passing them to NVTabular. In other words, how can I pass pre-trained embeddings as-is to my model? I hope that clarifies my question and use case.
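To make the layout concrete, this is roughly the shape of data I'd like to feed in (a sketch with placeholder vectors; in practice they would come from the collaborative-filtering model):

```python
import numpy as np
import pandas as pd

n = 64  # embedding width learned by the second model
movie_ids = [1, 2, 3]
# Placeholder vectors standing in for the pre-trained embeddings.
vectors = [np.random.rand(n).tolist() for _ in movie_ids]

# One list column instead of n separate scalar columns.
df = pd.DataFrame({
    "movie_id": movie_ids,
    "movie_embedding": vectors,
    "label": [0, 1, 0],
})
df.to_parquet("train.parquet")  # list columns need the pyarrow engine
```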

rnyak commented 1 year ago

@MelissaKR thanks for the clarification. We are currently working on that and will be creating an example shortly. The example might not be in PyTorch, but I believe you can adapt it to your framework :) Can you please tell me the architecture of your main model? Is it an MLP, or something more complicated? Also, can you share a simple screenshot of what your data looks like? Does it contain nested 3D arrays, or is it something like below?

| movie_id | movie_embedding |
| --- | --- |
| 1 | [float1, float2, ..., float64] |
| 2 | [float1, float2, ..., float64] |
| ... | ... |
| n | [float1, float2, ..., float64] |

or more like this:

| movie_id | movie_genres_id | movie_genres_embeddings |
| --- | --- | --- |
| 1 | [1, 2, 3] | [[float1, float2, ..., float64], [float1, float2, ..., float64], ...] |
| 2 | [3, 5] | [[float1, float2, ..., float64], [float1, float2, ..., float64]] |