AI4Finance-Foundation / FinGPT

FinGPT: Open-Source Financial Large Language Models. 🔥 We release the trained models on Hugging Face.
https://ai4finance.org
MIT License

FinGPT: Open-Source Financial Large Language Models


Let us not expect Wall Street to open-source LLMs or open APIs, due to FinTech institutions' internal regulations and policies.

Blueprint of FinGPT

https://huggingface.co/FinGPT


What's New:

Why FinGPT?

1). Finance is highly dynamic. BloombergGPT trained an LLM on a mixture of finance data and general-purpose data, which took about 53 days at a cost of around $3M. It is costly to retrain an LLM like BloombergGPT every month or every week, so lightweight adaptation is highly favorable. FinGPT can be fine-tuned swiftly to incorporate new data (the cost falls significantly, to less than $300 per fine-tuning).

2). Democratizing Internet-scale financial data is critical, e.g., allowing timely model updates (monthly or weekly) using an automatic data curation pipeline. BloombergGPT has privileged data access and APIs, while FinGPT presents a more accessible alternative. It prioritizes lightweight adaptation, leveraging the best available open-source LLMs.

3). The key technology is RLHF (reinforcement learning from human feedback), which is missing in BloombergGPT. RLHF enables an LLM to learn individual preferences (risk-aversion level, investing habits, personalized robo-advising, etc.), which is the "secret" ingredient of ChatGPT and GPT-4.
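A rough back-of-the-envelope sketch of why LoRA-style lightweight adaptation is so much cheaper than full retraining: only the low-rank adapter matrices are trained. The matrix dimensions below are illustrative (a 4096x4096 projection, roughly the size found in a Llama-2-7b attention layer), not taken from FinGPT's code.

```python
# Illustrative parameter counting: LoRA adds two small matrices A (d_in x r)
# and B (r x d_out) per weight matrix, and trains only those.

def full_params(d_in: int, d_out: int) -> int:
    """Parameters updated by full fine-tuning of one weight matrix."""
    return d_in * d_out

def lora_params(d_in: int, d_out: int, rank: int) -> int:
    """Trainable parameters LoRA adds to the same matrix."""
    return d_in * rank + rank * d_out

full = full_params(4096, 4096)          # ~16.8M weights
lora = lora_params(4096, 4096, rank=8)  # ~65K trainable weights
print(f"LoRA trains {lora / full:.2%} of this matrix's weights")
```

At rank 8 this is well under 1% of the matrix's weights, which is the core reason a fine-tuning run can cost hundreds of dollars rather than millions.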

Milestone of AI Robo-Advisor: FinGPT-Forecaster

Try the latest released FinGPT-Forecaster demo at our HuggingFace Space

The dataset for FinGPT-Forecaster: https://huggingface.co/datasets/FinGPT/fingpt-forecaster-dow30-202305-202405


Enter the following inputs:

1. Ticker symbol (e.g. AAPL, MSFT, NVDA)
2. The date from which you want the prediction to be made (yyyy-mm-dd)
3. The number of past weeks from which market news is retrieved
4. Whether to add the latest basic financials as additional information

Click Submit, and you'll receive a well-rounded analysis of the company and a prediction for next week's stock price movement!

For a more detailed and customized implementation, please refer to FinGPT-Forecaster.
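As a small illustration of how the inputs above fit together, the prediction date and the number of past weeks define the window over which market news is retrieved. This is only a sketch of that mapping, not FinGPT-Forecaster's actual retrieval code.

```python
# Illustrative only: map a prediction date (yyyy-mm-dd) and a look-back of
# n_weeks to the (start, end) dates of the news-retrieval window.
from datetime import date, timedelta

def news_window(prediction_date: str, n_weeks: int):
    """Return (start, end) ISO dates covering n_weeks of news before prediction_date."""
    end = date.fromisoformat(prediction_date)
    start = end - timedelta(weeks=n_weeks)
    return start.isoformat(), end.isoformat()

print(news_window("2024-01-15", 3))  # ('2023-12-25', '2024-01-15')
```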

FinGPT Demos:

Current State-of-the-Art for Financial Sentiment Analysis

Instruction Tuning Datasets and Models

The datasets we used and the multi-task financial LLM models are available at https://huggingface.co/FinGPT.

Our Code

| Datasets | Train Rows | Test Rows | Description |
| --- | --- | --- | --- |
| fingpt-sentiment-train | 76.8k | N/A | Sentiment Analysis Training Instructions |
| fingpt-finred | 27.6k | 5.11k | Financial Relation Extraction Instructions |
| fingpt-headline | 82.2k | 20.5k | Financial Headline Analysis Instructions |
| fingpt-ner | 511 | 98 | Financial Named-Entity Recognition Instructions |
| fingpt-fiqa_qa | 17.1k | N/A | Financial Q&A Instructions |
| fingpt-fineval | 1.06k | 265 | Chinese Multiple-Choice Question Instructions |
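The datasets in the table are hosted under the FinGPT organization on the Hugging Face Hub, so they should be loadable with the `datasets` library (`pip install datasets`); the loading step below is a sketch and downloads data when run.

```python
# Sketch: load one of the instruction-tuning datasets from the Hub.

def hub_id(name: str) -> str:
    """Map a short dataset name from the table to its Hub repo id."""
    return f"FinGPT/{name}"

if __name__ == "__main__":
    from datasets import load_dataset  # downloads on first use
    ds = load_dataset(hub_id("fingpt-sentiment-train"))
    print(ds)  # split names and row counts should match the table above
```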

Multi-task financial LLM models:

```python
demo_tasks = [
    'Financial Sentiment Analysis',
    'Financial Relation Extraction',
    'Financial Headline Classification',
    'Financial Named Entity Recognition',
]
demo_inputs = [
    "Glaxo's ViiV Healthcare Signs China Manufacturing Deal With Desano",
    "Apple Inc. Chief Executive Steve Jobs sought to soothe investor concerns about his health on Monday, saying his weight loss was caused by a hormone imbalance that is relatively simple to treat.",
    'gold trades in red in early trade; eyes near-term range at rs 28,300-28,600',
    'This LOAN AND SECURITY AGREEMENT dated January 27 , 1999 , between SILICON VALLEY BANK (" Bank "), a California - chartered bank with its principal place of business at 3003 Tasman Drive , Santa Clara , California 95054 with a loan production office located at 40 William St ., Ste .',
]
demo_instructions = [
    'What is the sentiment of this news? Please choose an answer from {negative/neutral/positive}.',
    'Given phrases that describe the relationship between two words/phrases as options, extract the word/phrase pair and the corresponding lexical relationship between them from the input text. The output format should be "relation1: word1, word2; relation2: word3, word4". Options: product/material produced, manufacturer, distributed by, industry, position held, original broadcaster, owned by, founded by, distribution format, headquarters location, stock exchange, currency, parent organization, chief executive officer, director/manager, owner of, operator, member of, employer, chairperson, platform, subsidiary, legal form, publisher, developer, brand, business division, location of formation, creator.',
    'Does the news headline talk about price going up? Please choose an answer from {Yes/No}.',
    'Please extract entities and their types from the input sentence, entity types should be chosen from {person/organization/location}.',
]
```
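One way the instruction/input pairs above might be assembled into model prompts is an "Instruction / Input / Answer" template; the exact template string here is an assumption, so check the FinGPT repo for the one actually used in training.

```python
# Hypothetical prompt template for the instruction-tuned multi-task models.

def build_prompt(instruction: str, text: str) -> str:
    """Combine an instruction and an input into a single generation prompt."""
    return f"Instruction: {instruction}\nInput: {text}\nAnswer: "

prompt = build_prompt(
    "What is the sentiment of this news? Please choose an answer from {negative/neutral/positive}.",
    "Glaxo's ViiV Healthcare Signs China Manufacturing Deal With Desano",
)
print(prompt)
```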
| Models | Description | Function |
| --- | --- | --- |
| fingpt-mt_llama2-7b_lora | Fine-tuned Llama2-7b model with LoRA | Multi-Task |
| fingpt-mt_falcon-7b_lora | Fine-tuned falcon-7b model with LoRA | Multi-Task |
| fingpt-mt_bloom-7b1_lora | Fine-tuned bloom-7b1 model with LoRA | Multi-Task |
| fingpt-mt_mpt-7b_lora | Fine-tuned mpt-7b model with LoRA | Multi-Task |
| fingpt-mt_chatglm2-6b_lora | Fine-tuned chatglm2-6b model with LoRA | Multi-Task |
| fingpt-mt_qwen-7b_lora | Fine-tuned qwen-7b model with LoRA | Multi-Task |
| fingpt-sentiment_llama2-13b_lora | Fine-tuned llama2-13b model with LoRA | Single-Task |
| fingpt-forecaster_dow30_llama2-7b_lora | Fine-tuned llama2-7b model with LoRA | Single-Task |
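Because these are LoRA adapters rather than full checkpoints, they are applied on top of their base model. The sketch below uses `transformers` plus `peft`; the base-model repo id is an assumption inferred from the "Fine-tuned Llama2-7b" description (and Llama-2 weights are gated, requiring access approval).

```python
# Sketch: load a base model and apply a FinGPT LoRA adapter on top of it.
# The adapter -> base pairing is an assumption drawn from the table above.

ADAPTERS = {
    "FinGPT/fingpt-mt_llama2-7b_lora": "meta-llama/Llama-2-7b-hf",
}

def load_fingpt(adapter: str):
    """Download the base model and attach the LoRA adapter (heavy: runs a download)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    base = ADAPTERS[adapter]
    tokenizer = AutoTokenizer.from_pretrained(base)
    model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")
    model = PeftModel.from_pretrained(model, adapter)
    return tokenizer, model.eval()

if __name__ == "__main__":
    tokenizer, model = load_fingpt("FinGPT/fingpt-mt_llama2-7b_lora")
```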

Tutorials

[Training] Beginner's Guide to FinGPT: Training with LoRA and ChatGLM2-6B (One Notebook, $10 GPU)

Understanding FinGPT: An Educational Blog Series

FinGPT Ecosystem

FinGPT embraces a full-stack framework for FinLLMs with five layers:

  1. Data source layer: This layer assures comprehensive market coverage, addressing the temporal sensitivity of financial data through real-time information capture.
  2. Data engineering layer: Primed for real-time NLP data processing, this layer tackles the inherent challenges of high temporal sensitivity and low signal-to-noise ratio in financial data.
  3. LLMs layer: Focusing on a range of fine-tuning methodologies such as LoRA, this layer mitigates the highly dynamic nature of financial data, ensuring the model’s relevance and accuracy.
  4. Task layer: This layer is responsible for executing fundamental tasks. These tasks serve as the benchmarks for performance evaluations and cross-comparisons in the realm of FinLLMs.
  5. Application layer: Showcasing practical applications and demos, this layer highlights the potential capability of FinGPT in the financial sector.
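The five layers above can be sketched as a toy pipeline; every function here is a stand-in for illustration, not a FinGPT API.

```python
# Toy end-to-end sketch of the five-layer stack:
# data source -> data engineering -> LLM -> task -> application.

def fetch_news(ticker):
    """Data source layer stand-in: real-time market news capture."""
    return [f"{ticker} beats earnings estimates", f"{ticker} faces regulatory probe "]

def clean(docs):
    """Data engineering layer stand-in: denoise and normalize raw text."""
    return [d.strip().lower() for d in docs]

def llm_sentiment(doc):
    """LLM layer stand-in: a LoRA-fine-tuned model would go here."""
    return "positive" if "beats" in doc else "negative"

def sentiment_task(ticker):
    """Task layer stand-in: run the fundamental task end to end."""
    return [llm_sentiment(d) for d in clean(fetch_news(ticker))]

# Application layer stand-in: a demo would present these results.
print(sentiment_task("AAPL"))  # ['positive', 'negative']
```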

Open-Source Base Model used in the LLMs layer of FinGPT

| Base Model | Pretraining Tokens | Context Length | Model Advantages | Model Size | Experiment Results | Applications |
| --- | --- | --- | --- | --- | --- | --- |
| Llama-2 | 2T | 4096 | Excels on English-based market data | llama-2-7b and llama-2-13b | Llama-2 consistently shows superior fine-tuning results | Financial Sentiment Analysis, Robo-Advisor |
| Falcon | 1.5T | 2048 | Maintains high-quality results while being more resource-efficient | falcon-7b | Good for English market data | Financial Sentiment Analysis |
| MPT | 1T | 2048 | Can be trained with high throughput efficiency and stable convergence | mpt-7b | Good for English market data | Financial Sentiment Analysis |
| Bloom | 366B | 2048 | World's largest open multilingual language model | bloom-7b1 | Good for English market data | Financial Sentiment Analysis |
| ChatGLM2 | 1.4T | 32K | Exceptional capability for Chinese language expression | chatglm2-6b | Shows prowess for Chinese market data | Financial Sentiment Analysis, Financial Report Summary |
| Qwen | 2.2T | 8K | Fast response and high accuracy | qwen-7b | Effective for Chinese market data | Financial Sentiment Analysis |
| InternLM | 1.8T | 8K | Can flexibly and independently construct workflows | internlm-7b | Effective for Chinese market data | Financial Sentiment Analysis |

All Thanks to Our Contributors:

News

ChatGPT at AI4Finance

Introductory

The Journey of OpenAI GPT Models. GPT models explained: OpenAI's GPT-1, GPT-2, GPT-3.

(Financial) Big Data

Interesting Demos

ChatGPT for FinTech

ChatGPT Trading Bot

## Citing FinGPT

```
@article{yang2023fingpt,
  title={FinGPT: Open-Source Financial Large Language Models},
  author={Yang, Hongyang and Liu, Xiao-Yang and Wang, Christina Dan},
  journal={FinLLM Symposium at IJCAI 2023},
  year={2023}
}
@article{zhang2023instructfingpt,
  title={Instruct-FinGPT: Financial Sentiment Analysis by Instruction Tuning of General-Purpose Large Language Models},
  author={Boyu Zhang and Hongyang Yang and Xiao-Yang Liu},
  journal={FinLLM Symposium at IJCAI 2023},
  year={2023}
}
@article{zhang2023fingptrag,
  title={Enhancing Financial Sentiment Analysis via Retrieval Augmented Large Language Models},
  author={Zhang, Boyu and Yang, Hongyang and Zhou, Tianyu and Babar, Ali and Liu, Xiao-Yang},
  journal={ACM International Conference on AI in Finance (ICAIF)},
  year={2023}
}
@article{wang2023fingptbenchmark,
  title={FinGPT: Instruction Tuning Benchmark for Open-Source Large Language Models in Financial Datasets},
  author={Wang, Neng and Yang, Hongyang and Wang, Christina Dan},
  journal={NeurIPS Workshop on Instruction Tuning and Instruction Following},
  year={2023}
}
@article{2023finnlp,
  title={Data-centric FinGPT: Democratizing Internet-scale Data for Financial Large Language Models},
  author={Liu, Xiao-Yang and Wang, Guoxuan and Yang, Hongyang and Zha, Daochen},
  journal={NeurIPS Workshop on Instruction Tuning and Instruction Following},
  year={2023}
}
```
## LICENSE

MIT License

**Disclaimer: We are sharing code for academic purposes under the MIT license. Nothing herein is financial advice, and it is NOT a recommendation to trade real money. Please use common sense and always consult a professional before trading or investing.**