Open devgupta6 opened 10 months ago
Hi @devgupta6, thanks for opening an issue!
The easiest and recommended way to make a model available in transformers
is to add the modeling code directly on the hub: https://huggingface.co/docs/transformers/custom_models. Here is a more general guide on adding models: https://huggingface.co/docs/transformers/add_new_model
Hi @amyeroberts I had registered my LLM model using automodel like this - AutoModelForCausalLM.register(CustomAIConfig, CustomAI) but it is showing the error - --------------------------------------------------------------------------- ImportError Traceback (most recent call last)
Could you link to your model on the hub? Without seeing the code I'm not able to see what the issue might be.
Yes, sure. This is the model link: https://github.com/devgupta6/dev-ai
@devgupta6 Please read the documentation pages I sent over, as these contain all the information you should need. At the moment the model is just a torch implementation and doesn't have any of the necessary adaptations for the transformers library. For example, your model needs to inherit from `PreTrainedModel`. We can help with bugs and issues, but we can't write your code for you.
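To illustrate the shape this usually takes, here is a minimal sketch. The class names `CustomAIConfig`/`CustomAI` come from the snippet above, but the bodies (vocab size, layers, `model_type` string) are placeholder assumptions, not the actual dev-ai architecture:

```python
import torch
import torch.nn as nn
from transformers import (
    AutoConfig,
    AutoModelForCausalLM,
    PretrainedConfig,
    PreTrainedModel,
)

class CustomAIConfig(PretrainedConfig):
    model_type = "custom-ai"  # unique key used by the Auto* registries

    def __init__(self, vocab_size=1000, hidden_size=64, **kwargs):
        self.vocab_size = vocab_size
        self.hidden_size = hidden_size
        super().__init__(**kwargs)

class CustomAI(PreTrainedModel):
    config_class = CustomAIConfig  # ties the model to its config class

    def __init__(self, config):
        super().__init__(config)
        # Placeholder architecture: a real model would have a transformer stack here
        self.embed = nn.Embedding(config.vocab_size, config.hidden_size)
        self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False)

    def forward(self, input_ids, **kwargs):
        return self.lm_head(self.embed(input_ids))

# Register the config first, then the model; registering only the model
# and skipping AutoConfig.register is a common source of load-time errors.
AutoConfig.register("custom-ai", CustomAIConfig)
AutoModelForCausalLM.register(CustomAIConfig, CustomAI)

model = AutoModelForCausalLM.from_config(CustomAIConfig())
```

With both registrations in place, `AutoModelForCausalLM.from_config` (and, once saved with `save_pretrained`, `from_pretrained`) can resolve the custom class automatically.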
Model description
It is a generative artificial intelligence model. I have the architecture ready, but I am facing problems integrating it with the transformers architecture. It would be great if you could assist me with this.
Open source status
Provide useful links for the implementation
No response