jonathanlastmileai opened this issue 8 months ago (status: Open)
It would be nice to be able to create a Metric from a function with less required configuration.
Today, creating one requires boilerplate like this (imports added for completeness):

```python
import json
from functools import partial

# `metrics` and `common` are the library's eval modules.

async def _correct_fn(datum: str, expected: str) -> bool:
    return json.loads(datum) == json.loads(expected)

def correct_function(expected: str):
    return metrics.Metric(
        evaluation_fn=partial(_correct_fn, expected=expected),
        metric_metadata=common.EvaluationMetricMetadata(
            name="correct_function",
            description="True (pass) if function call is correct",
            best_value=True,
            worst_value=False,
            extra_metadata=dict(expected=expected),
        ),
    )
```
The API must still let the user supply the comparison function in some form; a decorator pattern is preferred.
Can you provide more context on why this is needed, or an example use case?