inspired-co / eaas_client

Evaluation as a Service for NLP
https://inspired-co.github.io/eaas_client/
Apache License 2.0

Evaluation-as-a-Service for NLP




Usage

Before using EaaS, please see the terms of use. Detailed documentation can be found here. To install EaaS, simply run:

```bash
pip install eaas
```

Run your "Hello, world"

A minimal EaaS application looks something like this:

```python
from eaas import Config, Client

client = Client(Config())

inputs = [{
    "source": "Hello, my world",
    "references": ["Hello, world", "Hello my world"],
    "hypothesis": "Hi, my world"
}]
metrics = ["rouge1", "bleu", "chrf"]

score_dic = client.score(inputs, metrics=metrics)
```

If eaas was installed successfully, printing score_dic should produce the results below. Each entry corresponds to one of the metrics passed in metrics (in the same order): the corpus entry is the corpus-level score, and the sample entry is a list of sample-level scores:

```python
score_dic = {'scores':
     [
         {'corpus': 0.6666666666666666, 'sample': [0.6666666666666666]},
         {'corpus': 0.35355339059327373, 'sample': [0.35355339059327373]},
         {'corpus': 0.4900623006253688, 'sample': [0.4900623006253688]}
     ]
}
```
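Because the entries in scores follow the order of the metrics list, you can pair them up for lookup by name. A minimal sketch in plain Python, using the output shown above (the helper and variable names here are illustrative, not part of the eaas API):

```python
# Pair metric names with their score entries, assuming score_dic has
# the shape shown above ("scores" ordered like the `metrics` argument).
metrics = ["rouge1", "bleu", "chrf"]
score_dic = {
    "scores": [
        {"corpus": 0.6666666666666666, "sample": [0.6666666666666666]},
        {"corpus": 0.35355339059327373, "sample": [0.35355339059327373]},
        {"corpus": 0.4900623006253688, "sample": [0.4900623006253688]},
    ]
}

by_metric = dict(zip(metrics, score_dic["scores"]))
print(by_metric["bleu"]["corpus"])  # corpus-level BLEU score
```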


Supported Metrics

Currently, EaaS supports the following metrics:

The default configurations for each metric are described in this doc.

Asynchronous Requests

If you want to ask the EaaS server to calculate some metrics and continue local computation while waiting for the result, use the asynchronous client:

```python
from eaas import Config
from eaas.async_client import AsyncClient

config = Config()
client = AsyncClient(config)

inputs = ...
req = client.async_score(inputs, metrics=["bleu"])
# do some other computation
result = req.get_result()
```
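The async_score call returns a request handle, and get_result() blocks until the server responds. The same submit-now, collect-later pattern can be sketched locally with a thread pool; everything below (fake_score, AsyncRequest) is a hypothetical stand-in to illustrate the pattern, not the eaas implementation:

```python
from concurrent.futures import ThreadPoolExecutor

def fake_score(inputs, metrics):
    # Stand-in for the remote metric computation on the EaaS server.
    return {"scores": [{"corpus": 0.5, "sample": [0.5]} for _ in metrics]}

class AsyncRequest:
    """Illustrative handle wrapping a background computation."""

    def __init__(self, future):
        self._future = future

    def get_result(self):
        # Block until the background computation finishes.
        return self._future.result()

executor = ThreadPoolExecutor(max_workers=1)
# Submit the scoring work, then keep doing local computation.
req = AsyncRequest(executor.submit(fake_score, [], ["bleu"]))
# ... do some other computation here ...
result = req.get_result()
print(result["scores"][0]["corpus"])
```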