Closed: elronbandel closed this issue 3 weeks ago
The warning is printed (for no obvious reason) only during the preparation step, when the classes are built, but not in ordinary use of the classes:
[Unitxt|INFO|test_preparation.py:27] 2024-08-12 18:44:29,177 >> /home/dafna/workspaces/unitxt/prepare/**/*.py
[Unitxt|CRITICAL|test_preparation.py:28] 2024-08-12 18:44:29,177 >> Testing preparation files: ['/home/dafna/workspaces/unitxt/prepare/cards/sst2.py', '/home/dafna/workspaces/unitxt/prepare/cards/medical_abstracts.py', '/home/dafna/workspaces/unitxt/prepare/cards/mnli.py', ... (full list of a few hundred files under prepare/ trimmed) ..., '/home/dafna/workspaces/unitxt/prepare/metrics/llm_as_judge/rating/llama_3_ibm_genai_mt_bench_template.py']
/home/dafna/workspaces/unitxt/src/unitxt/dataclass.py:349: DeprecationWarning: Field 'use_query' is deprecated. From now on, default behavior is compatible to use_query=True. Please remove this field from your code.
original_init(self, *args, **kwargs)
/home/dafna/workspaces/unitxt/src/unitxt/dataclass.py:349: DeprecationWarning: The 'inputs' field is deprecated. Please use 'input_fields' instead.
original_init(self, *args, **kwargs)
/home/dafna/workspaces/unitxt/src/unitxt/dataclass.py:349: DeprecationWarning: The 'outputs' field is deprecated. Please use 'reference_fields' instead.
original_init(self, *args, **kwargs)
Interesting. I think it has confused a few people.
DeprecatedField is not a solution for catching offenders either. It only comes into effect when the class is built and its fields -- as declared, before any non-init value is assigned -- are scanned. If a field is a DeprecatedField, the message in its metadata is printed as part of that scan. DeprecatedField does not catch an instantiation of the class with specific values (not the init ones) assigned to that field, whether it is a DeprecatedField or any other Field:
(virtual39) dafna@DESKTOP-GM8R3J7:~/workspaces/unitxt$ cat tests/mine/copyfields.py
from unitxt.operators import Copy
from unitxt.stream import MultiStream
instance = {"predictions": {"a": 3, "b": 4}}
ms = MultiStream.from_iterables({"tmp": [instance]})
copyoperator = Copy(field="predictions", to_field="predictions_orig", use_query=False)
ms = copyoperator(ms)
(virtual39) dafna@DESKTOP-GM8R3J7:~/workspaces/unitxt$ python -m tests.mine.copyfields
(virtual39) dafna@DESKTOP-GM8R3J7:~/workspaces/unitxt$
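The scan-at-class-build behavior described above can be sketched with a minimal stand-in. The `ScanningMeta` and `DeprecatedField` names here are illustrative only, not unitxt's actual internals:

```python
import warnings


class DeprecatedField:
    """Stand-in for a field object carrying a deprecation message in its metadata."""

    def __init__(self, msg):
        self.msg = msg


class ScanningMeta(type):
    """Scans the declared fields exactly once, when the class is built."""

    def __new__(mcls, name, bases, namespace):
        for value in namespace.values():
            if isinstance(value, DeprecatedField):
                warnings.warn(value.msg, DeprecationWarning, stacklevel=2)
        return super().__new__(mcls, name, bases, namespace)


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")

    class Copy(metaclass=ScanningMeta):  # the warning fires HERE, at class build
        use_query = DeprecatedField("Field 'use_query' is deprecated.")

    op = Copy()
    op.use_query = False  # silent: nothing rescans the fields on assignment

print(len(caught))  # → 1: only the class definition triggered the warning
```

Exactly one warning is recorded, and it comes from defining the class, not from the instantiation that actually passes the deprecated value.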
The solution is probably to migrate each case (each class in which we deprecate a field) into the verify of that class, as was done for Task.
Solving this globally through an attribute of the Field seems useless.
And perhaps get rid of DeprecatedField altogether: all it does is confuse users, who see its message printed by dataclass at the time the general (not the specific) definition of the class is digested.
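A verify-style check, by contrast, runs per instance, so it does catch a caller who actually passes the deprecated value. A rough sketch -- the verify timing and message wording are assumptions modeled on the Task treatment mentioned above, not unitxt's actual code:

```python
import warnings


class Copy:
    """Sketch: deprecation handled inside the class's own verify(), per instance."""

    def __init__(self, field, to_field, use_query=None):
        self.field = field
        self.to_field = to_field
        self.use_query = use_query
        self.verify()

    def verify(self):
        # Runs on every instantiation, so it also catches callers that
        # pass a concrete value for the deprecated field.
        if self.use_query is not None:
            warnings.warn(
                "Field 'use_query' is deprecated; the default behavior is now "
                "compatible with use_query=True. Please remove this field.",
                DeprecationWarning,
                stacklevel=2,
            )


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    Copy(field="predictions", to_field="predictions_orig", use_query=False)

print(len(caught))  # → 1: the instantiation itself is flagged
```

Unlike the metadata-scan approach, the copyfields.py reproduction above would no longer run silently.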
Generally speaking, some types of class arguments (e.g. Callable) can only be assigned via a Field, not via a plain `=`. In other words: fields scanned when the class is added to the catalog are treated differently from fields operated on later. I am not sure whether this is intentional or a side effect of how the catalog is used.
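The standard library's dataclasses module has an analogous split, which may or may not be what unitxt's dataclass inherited: some defaults are legal as a plain `=`, while others are rejected at class-build time and must go through `field(...)`. A small stdlib-only illustration (not unitxt's own Field):

```python
from dataclasses import dataclass, field


@dataclass
class Step:
    name: str = "copy"                        # plain '=' is fine for immutable defaults
    tags: list = field(default_factory=list)  # mutable defaults must go through field(...)


# Declaring 'tags: list = []' instead is rejected when the class is BUILT,
# not when it is instantiated -- the same "scan at definition time" pattern.
caught = None
try:
    @dataclass
    class Bad:
        tags: list = []
except ValueError as exc:
    caught = exc

print(type(caught).__name__)  # → ValueError
```

The point of the analogy is only the timing: the restriction is enforced while the class definition is digested, exactly the phase in which unitxt prints the DeprecatedField messages.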
A clear demonstration of a source of confusion for many people:
```python
import unittest

from unitxt.utils import import_module_from_file

file_path = "examples/evaluate_different_templates_num_demos.py"


class TestExamples(unittest.TestCase):
    def test_examples(self):
        import_module_from_file(file_path)


if __name__ == "__main__":
    te = TestExamples()
    te.test_examples()
```
Run the above in two ways:

python -m <the above>

and

python -m unittest <the above>

and, confusingly enough, only the second run shows the deprecation warnings.
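The likely mechanism is Python's default warning filters rather than anything unitxt does: by default, DeprecationWarning is shown only when triggered from code in `__main__`, while `python -m unittest` enables the "default" warnings filter for the duration of the test run. A self-contained demonstration with generic module names, nothing unitxt-specific:

```python
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as d:
    # dep.py warns at import time, like building a class with a DeprecatedField.
    with open(os.path.join(d, "dep.py"), "w") as f:
        f.write("import warnings\n"
                "warnings.warn('use_query is deprecated', DeprecationWarning)\n")
    # The test imports dep during the test run (as import_module_from_file does).
    with open(os.path.join(d, "test_dep.py"), "w") as f:
        f.write("import unittest\n"
                "class T(unittest.TestCase):\n"
                "    def test_import(self):\n"
                "        import dep\n")

    plain = subprocess.run([sys.executable, "-c", "import dep"],
                           cwd=d, capture_output=True, text=True)
    tested = subprocess.run([sys.executable, "-m", "unittest", "test_dep"],
                            cwd=d, capture_output=True, text=True)

print("DeprecationWarning" in plain.stderr)   # → False: hidden by the default filter
print("DeprecationWarning" in tested.stderr)  # → True: unittest re-enables warnings
```

So the "testing-only" printouts are a property of the interpreter's warning filters, not of where the classes are built.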
Hi @elronbandel, please see if you want to do anything 'radical' about the different initialization/printouts of classes and fields inside vs. outside the testing environment, as demonstrated above. DeprecatedField, as it is today, seems completely useless in both environments.
Solved by https://github.com/IBM/unitxt/pull/1174
Where? Copy still has 'use_query' as a deprecated field.