ProjectUnifree / unifree

MIT License

Added support for multi-process local LLM #41

Closed bshikin closed 1 year ago

bshikin commented 1 year ago

Add MultiprocessLocalLLM, a class that wraps an LLM implementation and runs inference (the query method) in separate processes on the local machine.

The configuration would look like:


llm:
  class: MultiprocessLocalLLM
  llm_config:
    class: <wrapped LLM class>
    config: <wrapped LLM config>
  wrapper_config:
    num_workers: 5
    query_timeout_sec: 10
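The wrapper described above could be sketched roughly as follows. This is a hypothetical illustration, not the actual unifree implementation: the worker function, class shape, and parameter names (beyond num_workers and query_timeout_sec from the config) are assumptions.

```python
# Hypothetical sketch of a multi-process LLM wrapper; names other than
# num_workers / query_timeout_sec are assumptions, not unifree's API.
from concurrent.futures import ProcessPoolExecutor


def _query_worker(prompt: str) -> str:
    # Stand-in for the wrapped LLM's query() call. A real worker would
    # construct the wrapped LLM from llm_config and run inference there.
    return f"response to: {prompt}"


class MultiprocessLocalLLM:
    def __init__(self, num_workers: int = 5, query_timeout_sec: float = 10.0):
        # One process per worker, so each query runs outside the caller's
        # process and cannot block it past the timeout.
        self._pool = ProcessPoolExecutor(max_workers=num_workers)
        self._timeout = query_timeout_sec

    def query(self, prompt: str) -> str:
        # Submit inference to a worker process and enforce the timeout;
        # future.result raises TimeoutError if the worker is too slow.
        future = self._pool.submit(_query_worker, prompt)
        return future.result(timeout=self._timeout)

    def shutdown(self) -> None:
        self._pool.shutdown()
```

With this shape, query_timeout_sec bounds how long a caller waits on any single inference, while num_workers bounds how many inferences run concurrently on the local box.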