neulab / ExplainaBoard

Interpretable Evaluation for AI Systems

Add text to sql eval metric clean #580

Closed: shuaichenchang closed this pull request 1 year ago

shuaichenchang commented 1 year ago

Add text-to-sql evaluation metrics

Overview

This PR is the first of three PRs to add text-to-sql evaluation metrics.

  1. Add exact set match accuracy and execution accuracy metrics, along with their test files (this PR)
  2. Add a loader and an integration test
  3. Add processors

Details

This PR adds the metrics in explainaboard/metrics and test functions in integration_tests/metric_test.py. The metrics rely on the code in explainaboard/third_party/text_to_sql_test_suit_eval to perform text-to-sql evaluation.
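
For reviewers unfamiliar with the two metrics, the sketch below illustrates the underlying ideas in plain Python. It is not the ExplainaBoard implementation or the third-party test-suite code: the clause splitting is heavily simplified (no SQL parsing, alias handling, or value anonymization), and the `singer` example query and the `db_path` argument are illustrative assumptions.

```python
import re
import sqlite3
from collections import Counter
from contextlib import closing

# Clause keywords used to segment a query; a real evaluator parses the SQL
# and handles nesting, aliases, and literal values, which are omitted here.
CLAUSE_KEYWORDS = ("select", "from", "where", "group by", "having", "order by", "limit")

_CLAUSE_PATTERN = re.compile(
    "(" + "|".join(r"\b" + re.escape(k) + r"\b" for k in CLAUSE_KEYWORDS) + ")"
)


def split_clauses(sql: str) -> dict[str, frozenset[str]]:
    """Map each clause keyword to the set of tokens in that clause, so the
    comparison ignores the order of components within a clause."""
    sql = re.sub(r"\s+", " ", sql.strip().lower())
    parts = [p.strip() for p in _CLAUSE_PATTERN.split(sql) if p.strip()]
    clauses: dict[str, frozenset[str]] = {}
    current = None
    for part in parts:
        if part in CLAUSE_KEYWORDS:
            current = part
            clauses[current] = frozenset()
        elif current is not None:
            clauses[current] = frozenset(re.split(r"[\s,]+", part))
    return clauses


def exact_set_match(pred_sql: str, gold_sql: str) -> bool:
    """Exact set match: every clause of the prediction must equal the
    corresponding gold clause as a set of components."""
    return split_clauses(pred_sql) == split_clauses(gold_sql)


def execution_match(pred_sql: str, gold_sql: str, db_path: str) -> bool:
    """Execution accuracy: run both queries on the same database and compare
    the returned rows."""
    with closing(sqlite3.connect(db_path)) as conn:
        try:
            pred_rows = conn.execute(pred_sql).fetchall()
        except sqlite3.Error:
            return False  # an unexecutable prediction counts as incorrect
        gold_rows = conn.execute(gold_sql).fetchall()
    # Compare as multisets so row order does not affect the result.
    return Counter(pred_rows) == Counter(gold_rows)


# Illustrative check: column order differs, but exact set match still holds.
pred = "SELECT age, name FROM singer WHERE age > 30"
gold = "SELECT name, age FROM singer WHERE age > 30"
assert exact_set_match(pred, gold)
```

In short, exact set match credits predictions that are structurally identical to the gold SQL up to the order of components within each clause, while execution accuracy also credits differently written queries that return the same results when executed.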