CarperAI / cheese

Used for adaptive human-in-the-loop evaluation of language and embedding models.
MIT License

Batched model input #29

Closed shahbuland closed 1 year ago

shahbuland commented 1 year ago

Currently, the model creates a queue of tasks sent to it by the pipeline's client, then handles them one at a time before sending results back. If processing is fast enough that the queue never accumulates many tasks, this is fine. However, it is very likely that the queue will start to fill faster than a model running on unbatched data can keep up with it. We need to collate the data and allow model processing to be done in batched form. Current Idea:
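A rough sketch of the collation step described above, using Python's standard `queue` module: wait for one task, then greedily drain whatever else is already waiting, up to a batch-size cap. The `collate_batch` name and its semantics are illustrative assumptions here, not cheese's actual API:

```python
import queue

def collate_batch(task_queue, max_batch_size=8):
    # Hypothetical helper: block until at least one task arrives,
    # then greedily pull any other tasks already queued, capped at
    # max_batch_size, so the model can run one batched forward pass.
    batch = [task_queue.get()]
    while len(batch) < max_batch_size:
        try:
            batch.append(task_queue.get_nowait())
        except queue.Empty:
            break  # queue drained; run with a partial batch
    return batch

q = queue.Queue()
for i in range(5):
    q.put(f"task-{i}")

print(collate_batch(q, max_batch_size=4))  # first 4 tasks batched together
print(collate_batch(q, max_batch_size=4))  # the 1 remaining task
```

This keeps latency low when the queue is nearly empty (a batch of one goes out immediately) while letting throughput scale automatically when tasks pile up faster than unbatched processing can drain them.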

shahbuland commented 1 year ago

Resolved in #31