AI4Finance-Foundation / FinGPT

FinGPT: Open-Source Financial Large Language Models! Revolutionize 🔥 We release the trained model on HuggingFace.
https://ai4finance.org
MIT License

please how to call it locally #185

Open NanshaNansha opened 3 weeks ago

NanshaNansha commented 3 weeks ago

*(screenshot attachment)*

Siddharth-Latthe-07 commented 1 week ago

@NanshaNansha To load a model locally using the `PeftModel` class, you need to ensure that the base model and the other required files are available locally. Try out these steps and let me know if they work:

1. Install the dependencies:

    ```
    pip install transformers peft
    ```

2. Prepare local paths: set the paths where your pretrained base model, the fine-tuned adapter, and the cache directory are located. Example snippet:

    ```python
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    # Define the local paths to the base model, the adapter, and the cache directory
    base_model_path = 'path/to/your/base_model'
    model_id = 'path/to/your/local_model_directory'
    cache_dir = 'path/to/your/cache_directory'

    # Load the base model first: PeftModel.from_pretrained expects an
    # already-loaded model as its first argument, not a path
    base_model = AutoModelForCausalLM.from_pretrained(base_model_path, cache_dir=cache_dir)

    # Attach the fine-tuned adapter weights from the local directory
    model = PeftModel.from_pretrained(base_model, model_id, cache_dir=cache_dir)

    # Example: use the model for inference
    # Make sure you have the tokenizer and other necessary components
    tokenizer = AutoTokenizer.from_pretrained(base_model_path)
    input_text = "Your input text here"
    inputs = tokenizer(input_text, return_tensors="pt")
    outputs = model.generate(**inputs)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
    ```
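If loading fails, a common cause is an incomplete adapter directory. A quick sanity check before calling `PeftModel.from_pretrained` (a minimal sketch; `check_adapter_dir` is a hypothetical helper, and the file names reflect what PEFT writes when saving an adapter):

```python
import os

def check_adapter_dir(adapter_dir):
    """Return the expected adapter files missing from a local PEFT adapter directory.

    PEFT saves adapter_config.json alongside the adapter weights
    (adapter_model.bin or adapter_model.safetensors); at minimum these
    must be present for PeftModel.from_pretrained to succeed.
    """
    missing = []
    # The adapter config is always required
    if not os.path.isfile(os.path.join(adapter_dir, "adapter_config.json")):
        missing.append("adapter_config.json")
    # Either weight format is acceptable
    if not any(
        os.path.isfile(os.path.join(adapter_dir, name))
        for name in ("adapter_model.bin", "adapter_model.safetensors")
    ):
        missing.append("adapter_model.bin / adapter_model.safetensors")
    return missing
```

An empty return list means the directory at least contains the files PEFT looks for.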


Let me know if it works.
Thanks