Closed by Jameswlepage 2 months ago
Thank you for your interest in TransformersPHP and your enthusiasm for supporting Phi models! Phi 3 is indeed an exciting development, and I've seen promising results with it.
However, the current structure of TransformersPHP was not designed to handle models larger than 2GB. Beyond that size, ONNX exports split the model into a .onnx graph file plus a separate external data file (commonly .onnx_data) holding the weights, and TransformersPHP doesn't yet support loading models with external data.
I'm currently working on adding support for the Phi 1.5 and Phi 2 models, and these should be ready in the next release. As for supporting models with external data files, I don't have a set timeline yet. The next release focuses on major performance improvements overall, so I'll consider adding that feature afterwards.
If you have the time and interest, you're welcome to contribute to the project. Your help would be greatly appreciated!
Type of feature request
🌟New Model
Feature description
Implementing `PhiForCausalLM` and supporting the new Phi-3 (specifically this one) model would be incredible, as these models can reasonably be run locally. https://huggingface.co/docs/transformers/v4.40.0/en/model_doc/phi#overview
Motivation
Phi-3 looks to be competitive with models like Llama 3 8B, despite having only around 3 billion parameters. This size makes it ideal to run locally, opening up possibilities for completely private AI.