Unbabel / COMET

A Neural Framework for MT Evaluation
https://unbabel.github.io/COMET/html/index.html
Apache License 2.0

Using comet-mbr for Multi-Model Translation Ranking: Questions About Input Format and GPU Disabling #161

Open Aicha-cher opened 11 months ago

Aicha-cher commented 11 months ago

I'm trying to use the following command to rank translations produced by multiple models:

    comet-mbr -s [SOURCE].txt -t [MT_SAMPLES].txt --num_sample [X] -o [OUTPUT_FILE].txt

However, I have a couple of questions:

1. Could you clarify the expected format of the [MT_SAMPLES].txt file? How should the different translations be structured within it?
2. I tried disabling the GPU with the --gpus=0 option, but it appears to have no effect. Is there a different way to run on CPU?

ricardorei commented 11 months ago

Hey! There is a bit more information about the data format here.

You are right, the MBR command is hardcoded to use a single GPU.

I've added a bug label to this issue until it's solved.
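For anyone landing here, a minimal illustration of the layout the MBR command appears to expect: the samples file seems to hold --num_sample consecutive candidate lines per source sentence, in the same order as the source file. This is my reading of the setup, so please confirm against the linked docs.

    # src.txt (2 source sentences)
    Das ist ein Test.
    Wie geht es dir?

    # samples.txt used with --num_sample 3: three candidate translations
    # per source, on consecutive lines, in the same order as src.txt
    This is a test.
    That is a test.
    This is one test.
    How are you?
    How is it going?
    How do you do?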

Aicha-cher commented 11 months ago

Thanks, @ricardorei! I took the same approach a few days ago: I commented out this line locally because the flag wasn't being taken into account in the code, and everything worked perfectly after that.
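In case it helps others until the flag is fixed, here is a rough sketch of doing a similar ranking through the Python API on CPU by passing gpus=0 to predict. The checkpoint name and the pairwise pseudo-reference loop are my own assumptions for illustration, not the exact comet-mbr implementation, and it assumes a recent COMET version where predict returns an object with a .scores list.

    from comet import download_model, load_from_checkpoint

    # Illustrative checkpoint choice; any reference-based COMET model should work.
    model_path = download_model("Unbabel/wmt22-comet-da")
    model = load_from_checkpoint(model_path)

    def mbr_rank(source, candidates, batch_size=8):
        """Score each candidate against the other candidates used as pseudo-references
        and return the candidates sorted by their average score (MBR-style selection)."""
        data, index = [], []
        for i, hyp in enumerate(candidates):
            for j, pseudo_ref in enumerate(candidates):
                if i == j:
                    continue
                data.append({"src": source, "mt": hyp, "ref": pseudo_ref})
                index.append(i)
        # gpus=0 runs prediction on CPU.
        scores = model.predict(data, batch_size=batch_size, gpus=0).scores
        totals = [0.0] * len(candidates)
        counts = [0] * len(candidates)
        for i, s in zip(index, scores):
            totals[i] += s
            counts[i] += 1
        averages = [t / c for t, c in zip(totals, counts)]
        return sorted(zip(candidates, averages), key=lambda x: x[1], reverse=True)

    candidates = ["This is a test.", "That is a test.", "This is one test."]
    print(mbr_rank("Das ist ein Test.", candidates))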

Aicha-cher commented 11 months ago

Another suggestion, if possible: add the expected data format to the README.