Closed bdqnghi closed 6 years ago
Yeah, usually 2 is good enough. 50 helps when the initial accuracy is low, for instance if your initial dictionary is very weak. But if you start from an alignment that already gives something like 50% P@1, then very few iterations are needed.
I see, thanks :dango:
I wonder how one can choose the best number of refinement steps. Is it true that a larger number of refinement steps gives better results?
I did some experiments on my dataset. For example, with n_refinement = 50, the best result was at iteration 2, not at iteration 50.
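Since the best score can land at an early iteration, one option is to track a validation metric across refinement steps and keep the best checkpoint instead of the last one. A minimal sketch of that idea, where `refine_step` and `evaluate` are hypothetical placeholders for the actual refinement pass and validation scoring (e.g. P@1 on a held-out dictionary):

```python
def choose_best_refinement(refine_step, evaluate, max_steps=50, patience=3):
    """Run up to max_steps refinement iterations and return (best_step, best_score).

    refine_step: callable performing one refinement iteration (placeholder here).
    evaluate:    callable returning the current validation score (higher = better).
    patience:    stop after this many consecutive iterations without improvement.
    """
    best_step, best_score = 0, evaluate()  # score before any refinement
    bad_rounds = 0
    for step in range(1, max_steps + 1):
        refine_step()
        score = evaluate()
        if score > best_score:
            best_step, best_score, bad_rounds = step, score, 0
        else:
            bad_rounds += 1
            if bad_rounds >= patience:  # metric has plateaued; stop early
                break
    return best_step, best_score

# Toy usage: a score curve that peaks at iteration 2, as in the experiment above.
scores = iter([0.40, 0.52, 0.55, 0.54, 0.54, 0.53, 0.53])
best_step, best_score = choose_best_refinement(lambda: None, lambda: next(scores))
# best_step == 2, best_score == 0.55
```

With this kind of loop you can leave n_refinement high as a ceiling and let early stopping pick the actual number of iterations.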