hexiangnan / attentional_factorization_machine

TensorFlow Implementation of Attentional Factorization Machine

Difference between FM and AFM is not as pronounced as in the original paper #25

Open lee335 opened 4 years ago

lee335 commented 4 years ago

Hi, first of all, thanks for sharing the code of your AFM implementation. We tried to replicate the results from the original paper and executed the code exactly as given in this GitHub repository.

However, after 20 epochs of training, the difference between FM and AFM on the MovieLens dataset was not as large as shown in Figure 5 of the paper. Here are the training results for FM and AFM:

In Figure 5 of the paper, the RMSE of AFM after 20 epochs is below 0.45.
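For reference, the only structural difference between the two models is the attention weight $a_{ij}$ placed on each pairwise interaction; as given in the paper, FM weights every interaction equally:

$$\hat{y}_{FM}(\mathbf{x}) = w_0 + \sum_{i=1}^{n} w_i x_i + \sum_{i=1}^{n}\sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle\, x_i x_j$$

$$\hat{y}_{AFM}(\mathbf{x}) = w_0 + \sum_{i=1}^{n} w_i x_i + \mathbf{p}^{\top} \sum_{i=1}^{n}\sum_{j=i+1}^{n} a_{ij}\,(\mathbf{v}_i \odot \mathbf{v}_j)\, x_i x_j$$

So if the learned $a_{ij}$ end up nearly uniform, the two models should indeed produce nearly identical RMSEs, which matches what we observe.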

We also tried to reimplement the AFM model ourselves. With the help of the details given in this GitHub repository, we were already able to improve the performance of our AFM model significantly; a sketch of the pooling layer we implemented is below. Did we miss any other details, or is the code in this repository different from the code used in the paper?
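For anyone who wants to check our understanding, here is a minimal NumPy sketch of the attention-based pooling layer as we read it from the paper (the variable names are ours, not the repository's):

```python
import numpy as np

def afm_pairwise_pooling(embeddings, W, b, h, p):
    """Attention-based pooling over pairwise interactions (a sketch
    following the paper's formulation; variable names are ours).

    embeddings: (m, k) array, embeddings of the m active features
    W, b, h:    attention network parameters, shapes (t, k), (t,), (t,)
    p:          (k,) projection vector for the final score
    """
    m, _ = embeddings.shape
    # element-wise products v_i * v_j for all pairs i < j
    i, j = np.triu_indices(m, k=1)
    pairs = embeddings[i] * embeddings[j]            # (n_pairs, k)
    # attention scores: h^T ReLU(W (v_i * v_j) + b)
    scores = np.maximum(pairs @ W.T + b, 0.0) @ h    # (n_pairs,)
    # softmax normalization over all pairs
    att = np.exp(scores - scores.max())
    att /= att.sum()
    # attention-weighted sum of the interactions, projected by p
    return p @ (att[:, None] * pairs).sum(axis=0)
```

With all $a_{ij}$ fixed to 1 and $\mathbf{p}$ set to all ones, this reduces exactly to FM's interaction term, so any gain of AFM over FM has to come from the learned attention weights.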

We would be very grateful if you could help us, since we are currently trying to include the AFM model in our production pipeline.

lee335 commented 4 years ago

As mentioned in one of the GitHub issues, we also tried setting `batch_size` to 2048 and `lamda_attention` to 32, but with no success. FM and AFM are almost identical: the FM RMSE is 0.47 and the AFM RMSE is 0.471.
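For completeness, this is what we assume `lamda_attention` controls, based on the paper's objective: an L2 penalty on the attention network's weight matrix, added to the squared-error loss (a sketch with our own names, not the repository's exact code):

```python
import numpy as np

def afm_loss(y_true, y_pred, attention_W, lamda_attention=32.0):
    # Squared error plus L2 regularization on the attention network's
    # weight matrix; lamda_attention=32.0 mirrors the value we tried
    # (this is our assumption about the flag, not the repo's code).
    squared_error = np.sum((y_true - y_pred) ** 2)
    l2_penalty = lamda_attention * np.sum(attention_W ** 2)
    return squared_error + l2_penalty
```

If this reading is right, a penalty as strong as 32 would shrink the attention weights toward zero and push the attention scores toward uniform; that alone could make AFM behave almost like FM, consistent with the near-identical RMSEs above.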

haoojin commented 3 years ago

> As mentioned in one of the GitHub issues, we also tried setting `batch_size` to 2048 and `lamda_attention` to 32, but with no success. FM and AFM are almost identical: the FM RMSE is 0.47 and the AFM RMSE is 0.471.

I had the same problem. How did you finally solve it?

lee335 commented 3 years ago

> > As mentioned in one of the GitHub issues, we also tried setting `batch_size` to 2048 and `lamda_attention` to 32, but with no success. FM and AFM are almost identical: the FM RMSE is 0.47 and the AFM RMSE is 0.471.
>
> I had the same problem. How did you finally solve it?

We haven't solved it yet. If you find a solution, we would be very interested in it.