mrousavy / react-native-fast-tflite

🔥 High-performance TensorFlow Lite library for React Native with GPU acceleration
https://mrousavy.com
MIT License

Does it support LLMs like Llama? #89

Open eagle705 opened 1 month ago

eagle705 commented 1 month ago

I would like to use Llama in this library together with a Hugging Face tokenizer.

Is that possible? Is there a plan to support it? :)

I would also like to use it on iOS, so I expect to run the model with Core ML.
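For context, here is a minimal sketch of what running an already-converted model with the library's current API could look like, assuming the LLM could be exported to a `.tflite` file in the first place and that tokenization (e.g. via a Hugging Face tokenizer) happens separately in JS. The file path and single-call inference below are illustrative only, not an official Llama example:

```ts
// Sketch only: assumes the LLM has already been converted to a .tflite file.
// react-native-fast-tflite does not ship Llama or a tokenizer itself.
import { loadTensorflowModel } from 'react-native-fast-tflite'

// Runs one forward pass for a pre-tokenized prompt and returns the raw logits.
// Token IDs would have to come from a separate JS-side tokenizer.
async function runLlamaStep(tokenIds: number[]): Promise<Float32Array> {
  // The second argument requests the Core ML delegate on iOS
  // (falling back to CPU if it is unavailable).
  const model = await loadTensorflowModel(
    require('./assets/llm.tflite'),
    'core-ml'
  )

  // TFLite expects typed arrays matching the model's input tensors.
  const input = new Int32Array(tokenIds)
  const [logits] = await model.run([input])
  return logits as Float32Array
}
```

Note that real LLM support would need more than a single forward pass (autoregressive decoding, KV-cache handling, and the tokenizer itself), which is what this feature request is really about.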

maintenance-hans[bot] commented 1 month ago

Guten Tag, Hans here! 🍻

Thanks for your interest in using Llama with ze library! Currently, support for Llama or integration with a Hugging Face tokenizer is not explicitly mentioned in ze documentation. However, if you'd like to see it supported in ze future, it could help to open a feature request.

For now, if you run into specific implementation challenges or have logs from trying to run it on iOS with Core ML, please share those details so mrousavy can better understand ze issue.

If you enjoy ze project and want to support mrousavy, consider sponsoring it here – it helps keep ze project alive!

Note: If you think I made a mistake, please ping @mrousavy to take a look.