Hi there, I am trying to quantize my input feature `sparse_feat` with the following code in my network.

However, I've observed that in the first batch the codes I got were uniformly distributed, while in the second and following batches the codes and features within `codes_sparse` and `quantized_sparse` were almost all the same. The following are the codes I got for the second batch.

Note that the input features within `sparse_feat` are not alike, so I am wondering what could be going wrong here. Am I not configuring the quantization procedure correctly?

Looking forward to your helpful suggestions. Thanks in advance.
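Since the original snippet is not shown above, here is a minimal NumPy sketch of the kind of quantization step being described (nearest-codebook-entry assignment). The names `sparse_feat`, `codes_sparse`, and `quantized_sparse` are borrowed from the post; the shapes, the codebook, and everything else are assumptions for illustration only, not the actual network code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: a codebook of 8 entries, feature dimension 4.
codebook = rng.normal(size=(8, 4))        # (num_codes, dim)
sparse_feat = rng.normal(size=(16, 4))    # (num_tokens, dim)

# Nearest-neighbour assignment: squared distance to every codebook entry.
dists = ((sparse_feat[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
codes_sparse = dists.argmin(axis=1)       # (num_tokens,) integer code per token
quantized_sparse = codebook[codes_sparse] # (num_tokens, dim) quantized features

# How many distinct codes the batch uses; if quantization degenerates,
# this count drops toward 1 and the quantized features become near-identical.
print(len(np.unique(codes_sparse)))
```

Checking `np.unique(codes_sparse)` per batch is a quick way to see whether the codes really are "almost the same" or merely similar.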