svilupp / PromptingTools.jl

Streamline your life using PromptingTools.jl, the Julia package that simplifies interacting with large language models.
https://svilupp.github.io/PromptingTools.jl/dev/
MIT License

[Question/FR] Using some HuggingFace models #127

Closed camilogarciabotero closed 5 months ago

camilogarciabotero commented 5 months ago

Hi @svilupp

This package is an absolute piece of magic. Thanks for taking the time to build it. I was wondering (I could not find info in the docs) how to use HuggingFace models with PromptingTools.jl. I am interested in bringing in some particular models for biological analyses; they are transformer models. Any chance to access them from PromptingTools.jl?

svilupp commented 5 months ago

Hi there! Glad it’s helpful.

PromptingTools is only a wrapper calling the underlying API (eg, HuggingFace Inference Endpoints) or locally hosted services (eg, HF Text Generation Inference).

For large language models, both support the OpenAI-compatible messages schema, so you can use them with PT via the `CustomOpenAISchema`, providing the corresponding API key and URL.
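A minimal sketch of the above, assuming a locally hosted OpenAI-compatible server (the model name, API key, and URL below are illustrative placeholders, not from this thread):

```julia
using PromptingTools
const PT = PromptingTools

# Point PromptingTools at any OpenAI-compatible endpoint,
# e.g. a locally hosted HF Text Generation Inference server.
schema = PT.CustomOpenAISchema()

msg = aigenerate(schema, "Say hi!";
    model = "my-hosted-model",        # whatever model the endpoint serves
    api_key = "any-non-empty-string", # many local servers ignore the key
    api_kwargs = (; url = "http://localhost:8080"))  # endpoint base URL

println(msg.content)
```

The exact keyword names follow the PromptingTools.jl docs for custom/local providers; check your server's docs for whether it expects a `/v1` suffix on the URL.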

Is this what you’re trying to do? To have conversations about some biology domain?

If you’re looking for biology-specific generation (as per the link), I don’t think you need messages and prompting - it would just stand in your way. They seem to use the older transformer models - like BERT and T5 - fine-tuned for specific tasks. You can use them directly with HF TGI or, in some cases, with Transformers.jl (look for BERT- and T5-based models; those should be implemented).

Have a look here: https://chengchingwen.github.io/Transformers.jl/dev/huggingface/. With a bit of luck, you can use them directly in Julia. If not, I’d suggest PythonCall + HF TGI: https://huggingface.co/docs/text-generation-inference/en/index
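For the Transformers.jl route, a hedged sketch of loading a HuggingFace checkpoint via its `@hgf_str` string macro (the model name is just an example; the first call downloads the weights):

```julia
using Transformers
using Transformers.HuggingFace

# Load the tokenizer/text encoder and the model weights for a BERT checkpoint.
textenc = hgf"bert-base-uncased:tokenizer"
model   = hgf"bert-base-uncased:model"
```

From there, encoding text and running the forward pass follow the Transformers.jl HuggingFace docs linked above; task-specific heads (e.g. `:forsequenceclassification`) can be selected the same way after the colon.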

Does that answer your question?

camilogarciabotero commented 5 months ago

Thank you @svilupp I will take a look at Transformers.jl!