aksnzhy / xlearn

High-performance, easy-to-use, and scalable machine learning (ML) package, including linear model (LR), factorization machines (FM), and field-aware factorization machines (FFM), with a Python API and a command-line interface.
https://xlearn-doc.readthedocs.io/en/latest/index.html
Apache License 2.0

FFM SetTXTModel forgets a delimiter between parameters #122

Open kcstokely opened 6 years ago

kcstokely commented 6 years ago

From the docs: "For FM and FFM, we store one vector of the latent factor in each line."

Right now this outputs:

0.156129 0.223862 0.114672 0.1005360.103618 0.100097 0.314356 0.0808605

This is the output in the model TXT file for an FFM with 2 fields and 4 latent factors. Notice that the space is missing between the 4 latent values for field #1 and the 4 latent values for field #2.

EDIT: This is using the Python API.
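For context, this is roughly the flow that produces the dump (a minimal sketch following the Python API examples in the xLearn docs; the file names and hyperparameters are placeholders):

```python
import xlearn as xl

# Train a small FFM and dump a human-readable model via setTXTModel().
# File names and hyperparameters are placeholders; k=4 matches the 4 latent
# values per field in the output quoted above.
ffm_model = xl.create_ffm()
ffm_model.setTrain("./train.ffm")        # training data in libffm format
ffm_model.setTXTModel("./model.txt")     # text dump inspected in this issue

param = {"task": "binary", "lr": 0.2, "lambda": 0.002, "k": 4}
ffm_model.fit(param, "./model.out")
```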

aksnzhy commented 6 years ago

@kcstokely Thank you so much. We will fix this problem.

aksnzhy commented 5 years ago

I have updated the model output as shown below; you need to git clone the source code and rebuild it:

Linear:

bias: 0
i_0: 0
i_1: 0
i_2: 0
i_3: 0

FM:

bias: 0
i_0: 0
i_1: 0
i_2: 0
i_3: 0
v_0: 5.61937e-06 0.0212581 0.150338 0.222903
v_1: 0.241989 0.0474224 0.128744 0.0995021
v_2: 0.0657265 0.185878 0.0223869 0.140097
v_3: 0.145557 0.202392 0.14798 0.127928

FFM:

bias: 0
i_0: 0
i_1: 0
i_2: 0
i_3: 0
v_0_0: 5.61937e-06 0.0212581 0.150338 0.222903
v_0_1: 0.241989 0.0474224 0.128744 0.0995021
v_0_2: 0.0657265 0.185878 0.0223869 0.140097
v_0_3: 0.145557 0.202392 0.14798 0.127928
v_1_0: 0.219158 0.248771 0.181553 0.241653
v_1_1: 0.0742756 0.106513 0.224874 0.16325
v_1_2: 0.225384 0.240383 0.0411782 0.214497
v_1_3: 0.226711 0.0735065 0.234061 0.103661
v_2_0: 0.0771142 0.128723 0.0988574 0.197446
v_2_1: 0.172285 0.136068 0.148102 0.0234075
v_2_2: 0.152371 0.108065 0.149887 0.211232
v_2_3: 0.123096 0.193212 0.0179155 0.0479647
v_3_0: 0.055902 0.195092 0.0209918 0.0453358
v_3_1: 0.154174 0.144785 0.184828 0.0785329
v_3_2: 0.109711 0.102996 0.227222 0.248076
v_3_3: 0.144264 0.0409806 0.17463 0.083712
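For reference, here is a minimal sketch of how the labeled dump above could be parsed, assuming one "key: values" entry per line. The key names (bias, i_<n>, v_<a>_<b>) are taken from the example in this comment; whether the first or second index in v_<a>_<b> denotes the feature or the field is not stated here, and this is an illustration rather than an official xLearn parser.

```python
def parse_ffm_txt_model(path):
    """Parse the labeled FFM text dump into (bias, linear weights, latent vectors)."""
    bias = 0.0
    linear = {}   # feature index -> linear weight
    latent = {}   # (a, b) index pair -> list of k latent values
    with open(path) as f:
        for line in f:
            key, sep, rest = line.partition(":")
            if not sep:
                continue                      # skip empty or malformed lines
            key = key.strip()
            values = [float(x) for x in rest.split()]
            if key == "bias":
                bias = values[0]
            elif key.startswith("i_"):
                linear[int(key[2:])] = values[0]
            elif key.startswith("v_"):
                a, b = (int(x) for x in key[2:].split("_"))
                latent[(a, b)] = values
    return bias, linear, latent

# Example usage against the dump shown above ("./model.txt" is a placeholder):
bias, linear, latent = parse_ffm_txt_model("./model.txt")
print(bias, len(linear), len(latent))
```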