keroro824 / HashingDeepLearning

Codebase for "SLIDE : In Defense of Smart Algorithms over Hardware Acceleration for Large-Scale Deep Learning Systems"
MIT License

Issues with Implementation and Replicating Paper Results #36

Open its-sandy opened 3 years ago

its-sandy commented 3 years ago

Hi there! I am working on a PyTorch implementation of SLIDE and am currently trying to compare its performance against the original SLIDE. I ran into a few doubts/issues while evaluating SLIDE and would appreciate clarification on the following.

  1. I'm unable to replicate the accuracy-vs-iteration plot for the Delicious 200K dataset using the parameters mentioned in the paper (Simhash, K=9, L=50; plot attached). I also observe that SLIDE's accuracy seems to worsen beyond a certain point. What could be the reasons for these?
  2. I observe a few inconsistencies in the implementations of the WTA and DWTA hashes.
  3. What is the reason behind using Simhash for Delicious 200K and the DWTA hash for Amazon 670K?
  4. The paper mentioned extending SLIDE to convolutions as a future direction. Has there been any progress along this line?
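For context on what the K=9, L=50 setting in question 1 means: Simhash is signed random projections, where K is the number of hyperplane bits per hash code and L is the number of independent hash tables. The following is a minimal illustrative sketch, not the repository's actual C++ implementation; the input dimensionality D is arbitrary here.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 128          # input dimensionality (arbitrary for this sketch)
K, L = 9, 50     # bits per hash code, number of tables (values from the paper)

# One random projection matrix per table: L x K hyperplanes in R^D.
planes = rng.standard_normal((L, K, D))

def simhash_codes(x):
    """Return L integer hash codes (each K bits) for vector x."""
    signs = (planes @ x) > 0            # (L, K) booleans: which side of each hyperplane
    weights = 1 << np.arange(K)         # bit weights 2^0 .. 2^(K-1)
    return signs @ weights              # (L,) integer codes in [0, 2**K)

x = rng.standard_normal(D)
codes = simhash_codes(x)
```

Nearby vectors (small angle) flip few hyperplane signs and so collide in many of the L tables, while unrelated vectors match a given K-bit code with probability roughly 2^-K; increasing K makes each table more selective, and increasing L recovers recall.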
Eslam2011 commented 2 years ago

@its-sandy Hi,

May I ask why you are working on a PyTorch implementation, and whether you were able to reproduce the results from the paper? I have tried running the code many times, and it always terminates with "Killed".

Also, did you find the weight and savedweight files? I can't find them, and I really need to run the code.