Closed — jackabald closed this issue 3 weeks ago
Any update would be great. @jackabald
https://huggingface.co/models?pipeline_tag=text-generation
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3.1-8B")
model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3.1-8B")
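To sketch what the loaded pieces are used for: the tokenizer turns the prompt into token IDs, the model generates a continuation, and the tokenizer decodes it back to text. Note that Meta-Llama-3.1-8B is gated and ~16 GB, so this sketch swaps in the tiny stand-in model `sshleifer/tiny-gpt2` purely for illustration; the API calls are the same once you have access to the Llama weights.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Stand-in model for illustration; replace with
# "meta-llama/Meta-Llama-3.1-8B" once gated access is granted.
model_id = "sshleifer/tiny-gpt2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode the prompt, generate a short continuation, decode back to text.
inputs = tokenizer("Hello, world", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=10)
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```

Since `generate` returns the prompt tokens followed by the new ones, the decoded string starts with the original prompt.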
Any luck looking here? https://www.npi.ai/docs
So my research into Dify has kind of led me astray. I don't know if there's any implementation we could use directly; it seems to be more of a low-code way to build LLM apps and RAG pipelines with agents. I think Dify would be a better fit if we were starting a new project.
Thanks for that. Please validate this idea of using a tokenizer:
from transformers import AutoTokenizer, AutoModelForCausalLM
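To make the tokenizer idea concrete, here is a minimal round-trip sketch: text in, token IDs out, and back again. It uses the small `gpt2` tokenizer for illustration (an assumption on my part, to keep it runnable without gated access); the Llama tokenizer exposes the same `AutoTokenizer` API.

```python
from transformers import AutoTokenizer

# Small ungated tokenizer used as a stand-in for the Llama one.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

# Text -> integer token IDs.
ids = tokenizer.encode("validate this idea")
print(ids)

# Token IDs -> text (round-trips for plain ASCII input).
text = tokenizer.decode(ids)
print(text)
```

This is the core of what the tokenizer buys us: a stable mapping between strings and the integer IDs the model actually consumes.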
https://docs.dify.ai/guides/model-configuration