An implementation of a deep learning recommendation model (DLRM)
RuntimeError: [enforce fail at embedding_lookup_idx.cc:215] current == index_size. 0 vs -1. Your input seems to be incorrect: the sum of lengths values should be the size of the indices tensor, but it appears not. #352
But when I run this:

```python
embeddings = F.embedding_bag(
    input_hbm,
    self.cache_weight_mgr.hbm_cached_weight,
    offsets,
    self.max_norm,
    self.norm_type,
    self.scale_grad_by_freq,
    self.mode,
    self.sparse,
    per_sample_weights,
    self.include_last_offset,
    self.padding_idx,
)
```

the following error appears:

```
line 2392, in embedding_bag
    ret, _, _, _ = torch.embedding_bag(
RuntimeError: [enforce fail at embedding_lookup_idx.cc:215] current == index_size. 0 vs -1. Your input seems to be incorrect: the sum of lengths values should be the size of the indices tensor, but it appears not.
```
What causes this, and how can I fix it?
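For context on what the error is checking: this failure usually means that `offsets` is inconsistent with `input` — `embedding_bag` internally converts `offsets` into per-bag lengths, and the sum of those lengths must equal the number of indices in `input`. In particular, when `include_last_offset=True`, the final entry of `offsets` must equal `input.numel()`. Below is a minimal sketch of a consistent call (the tensors here are made-up examples, not values from this issue):

```python
import torch
import torch.nn.functional as F

# Toy embedding table: 10 rows of dimension 4.
weight = torch.randn(10, 4)

# 8 indices in total, flattened across all bags.
indices = torch.tensor([1, 2, 4, 5, 4, 3, 2, 9])

# offsets mark where each bag starts inside `indices`.
# With include_last_offset=True, the last entry must equal
# indices.numel() (here 8), so this defines 2 bags: [0:4] and [4:8].
offsets = torch.tensor([0, 4, 8])

out = F.embedding_bag(indices, weight, offsets,
                      mode="mean", include_last_offset=True)
print(out.shape)  # one row per bag: torch.Size([2, 4])
```

If the last offset is larger than `indices.numel()`, or `include_last_offset` does not match how `offsets` was built, the lengths check at `embedding_lookup_idx.cc` fails with exactly this message, so comparing `offsets[-1]` against `input_hbm.numel()` at the call site is a good first debugging step.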