NASA-IMPACT / evalem

An evaluation framework for your large model pipelines

integration with huggingface transformers? #18

Closed · rbavery closed this 8 months ago

rbavery commented 1 year ago

This is a ticket to discuss whether we should make transformers a dependency of this repo, to preprocess data inputs and load models for evaluation pipelines. See https://github.com/NASA-IMPACT/hls-foundation/issues/3 for more discussion of the features of transformers and how they might fit the distribution and use of the geospatial FM in particular.

The transformers library contains countless models for text and computer vision. When a model is published to the library, it can be easily downloaded and configured, and its data inputs are handled by model-specific processors. transformers also organizes base models so that they can be reused for downstream tasks.
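For instance, pulling down a published model together with its model-specific processor typically takes only a few lines (the checkpoint name below is just an example):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Download the weights and the matching tokenizer from the Hub;
# the checkpoint name is illustrative.
checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# Model-specific preprocessing: the tokenizer turns raw text into tensors.
inputs = tokenizer("HLS imagery looks great", return_tensors="pt")
outputs = model(**inputs)
```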

cc @weiji14 @NISH1001 @muthukumaranR

NISH1001 commented 1 year ago

@rbavery Right now, for LLMs, we do wrap around the HF pipeline as `HFPipelineWrapper` (see the source), which itself is a type of `ModelWrapper`.

Currently, we have `QuestionAnsweringHFPipelineWrapper` and `TextClassificationHFPipelineWrapper`.

The idea is to have one wrapper type -- more like a framework-agnostic implementation -- around any upstream model (say transformers, maybe other ML models, etc.). So, the upstream HF pipeline can handle most of the preprocessing. However, we also provide a mechanism to inject our own custom preprocessor and postprocessor that run before/after the forward pass through the PipelineWrappers. A rough sketch is below.
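A minimal sketch of that idea, in case it helps the discussion (everything beyond the `ModelWrapper`/`HFPipelineWrapper` names is made up for illustration; the actual implementation differs):

```python
from typing import Any, Callable, Optional

from transformers import pipeline


class ModelWrapper:
    """Framework-agnostic wrapper around any upstream model."""

    def __init__(
        self,
        model: Any,
        preprocessor: Optional[Callable] = None,
        postprocessor: Optional[Callable] = None,
    ) -> None:
        self.model = model
        self.preprocessor = preprocessor or (lambda x: x)
        self.postprocessor = postprocessor or (lambda x: x)

    def __call__(self, inputs: Any) -> Any:
        # Custom hook before the forward pass...
        outputs = self.model(self.preprocessor(inputs))
        # ...and after it.
        return self.postprocessor(outputs)


class HFPipelineWrapper(ModelWrapper):
    """Wraps an HF `pipeline`, which already handles most preprocessing."""


# e.g., a QA wrapper that keeps only the answer string
qa = HFPipelineWrapper(
    pipeline("question-answering"),
    postprocessor=lambda preds: preds["answer"],
)
answer = qa({"question": "What is evalem?", "context": "evalem is an evaluation framework."})
```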

Maybe we could do something for CV as well? If so, we might have to differentiate the wrapper types for LLM vs. vision.
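For vision, reusing the `ModelWrapper` sketch above, the differentiation could be as thin as a subclass around an image pipeline (again, names are illustrative, not an existing API):

```python
from transformers import pipeline


class VisionPipelineWrapper(ModelWrapper):
    """Same contract as the text wrappers, but around an image pipeline."""


clf = VisionPipelineWrapper(pipeline("image-classification"))
preds = clf("path/to/image.png")  # accepts a path, URL, or PIL image
```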