nat / openplayground

An LLM playground you can run on your laptop
MIT License

Can this support cpp metal? #89

Closed bbecausereasonss closed 1 year ago

bbecausereasonss commented 1 year ago

llama.cpp's Metal backend was recently released for M1 inference. Can openplayground support it?

zainhuda commented 1 year ago

This project is for running multiple models behind a standardized interface. Model inference can happen in any way you'd like (external providers, local inference, etc.), so yes, we can support llama.cpp with Metal. You would need to connect the inference code to the backend and provide a text_generation method that can be called from the web interface. Take a look at the documentation for adding models; it will be very similar to that.
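For a rough idea of what that bridge could look like, here is a minimal sketch using the llama-cpp-python bindings (built with Metal support) for local inference. The class name, constructor arguments, and the way the method gets registered are illustrative assumptions, not openplayground's actual API; the docs for adding models describe the real hooks.

```python
# Hypothetical bridge between openplayground's backend and llama.cpp
# running with Metal on Apple Silicon, via llama-cpp-python.
# Names below (LlamaCppProvider, text_generation) are illustrative only.
from llama_cpp import Llama  # pip install llama-cpp-python (compiled with Metal)


class LlamaCppProvider:
    def __init__(self, model_path: str):
        # n_gpu_layers=-1 offloads all layers to the GPU (Metal on M1/M2)
        self.llm = Llama(model_path=model_path, n_gpu_layers=-1)

    def text_generation(self, prompt: str, max_tokens: int = 256,
                        temperature: float = 0.8) -> str:
        # Called with the prompt and sampling settings from the web UI.
        result = self.llm(prompt, max_tokens=max_tokens,
                          temperature=temperature)
        return result["choices"][0]["text"]
```

The backend would then call text_generation the same way it calls an external provider, so the web interface doesn't need to know whether inference is remote or local.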

bbecausereasonss commented 1 year ago

Okay, will try. Thanks!