huggingface / lighteval

Lighteval is your all-in-one toolkit for evaluating LLMs across multiple backends
MIT License

[FT] Enable batched dataset_filter #322

Open chuandudx opened 2 months ago

chuandudx commented 2 months ago

Issue encountered

When defining a custom `dataset_filter` in a custom `LightevalTaskConfig` (code), I wanted to specify an `hf_filter` that filters the dataset by language.

It seems that, by default, the examples are not processed in batches when the filter is applied.

More specifically, I tried to implement a vectorized filtering function, which only worked with `batched=True`; however, it seems difficult to control this value from the task config.

My initial language filter was:

def create_language_filter(target_language):
    def language_filter(examples):
        return [language == target_language for language in examples['language']]
    return language_filter

With `dataset = dataset.filter(dataset_filter, batched=False)`, the dataset is not actually filtered by language during testing. When I ran `dataset = dataset.filter(dataset_filter, batched=True)`, the filtering was successful.

My testing code is below; maybe it is not representative of how the lighteval task runs?

from itertools import islice

dataset_filter = create_language_filter(language)
dataset = dataset.filter(dataset_filter, batched=True)  # switch between False and True
for i, sample in enumerate(islice(dataset, 5)):
    print(f"\nSample {i + 1}:")
    print(f"Language: {sample['language']}")
    print(f"Text: {sample['text'][:100]}...")
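To spell out why the original filter appears to do nothing with `batched=False`, here is a minimal sketch (runnable without `datasets`, and the truthiness point about `Dataset.filter` is my reading of the behavior, not confirmed from its source): with `batched=False` the function receives a single example, so `examples['language']` is one string, and iterating over it yields its characters. The list comprehension then returns a non-empty list of `False` values, which is presumably treated as truthy, so every example is kept.

```python
def create_language_filter(target_language):
    def language_filter(examples):
        return [language == target_language for language in examples['language']]
    return language_filter

f = create_language_filter("en")

# batched=True: examples['language'] is a list of language codes,
# so the comprehension produces a per-example boolean mask.
assert f({"language": ["en", "fr", "en"]}) == [True, False, True]

# batched=False: examples['language'] is the single string "fr";
# iterating over it yields the characters "f" and "r", producing
# [False, False] -- a non-empty list, hence truthy.
assert f({"language": "fr"}) == [False, False]
assert bool([False, False]) is True  # why the example is kept anyway
```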

Therefore, I modified the function as follows so it works in both modes, but could the evaluation be slower due to single-example processing during filtering?

def create_language_filter(target_language):
    def language_filter(examples):
        # batched=True: examples['language'] is a list of language codes
        if isinstance(examples['language'], list):
            return [language == target_language for language in examples['language']]
        # batched=False: examples['language'] is a single language code
        return examples['language'] == target_language
    return language_filter

Solution/Feature

I am wondering if there is interest in:

  1. exposing this parameter so it is easily configurable,
  2. setting `batched` to `True` by default, or
  3. another way to run the filtering such that this isn't an issue?
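For option 1, a minimal sketch of what exposing the flag could look like. Everything here is hypothetical: `dataset_filter_batched` and `TaskFilterConfig` are illustrative names, not existing `LightevalTaskConfig` fields, and `apply_filter` just forwards the flag to `datasets.Dataset.filter`.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class TaskFilterConfig:
    # The user-supplied filter callable, as in dataset_filter today.
    dataset_filter: Optional[Callable] = None
    # Hypothetical new field: forwarded to Dataset.filter(batched=...).
    dataset_filter_batched: bool = False

def apply_filter(dataset, cfg: TaskFilterConfig):
    """Apply the configured filter, forwarding the batched flag."""
    if cfg.dataset_filter is not None:
        return dataset.filter(cfg.dataset_filter, batched=cfg.dataset_filter_batched)
    return dataset
```

This keeps the current default (`batched=False`) so existing per-example filters keep working, while letting vectorized filters opt in.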

Thank you!