Happy Transformer

Documentation and news: happytransformer.com

Join our Discord server: Support Server

Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.

Version 3.0.0

  1. DeepSpeed for training
  2. Apple's MPS for training and inference
  3. WandB to track training runs
  4. Data supplied for training is automatically split into training and evaluation portions
  5. Push models directly to Hugging Face's Model Hub

Read about the full 3.0.0 update, including breaking changes, here.
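Several of these features come together at training time. Below is a minimal sketch, assuming a local `train.txt` file and the 3.0 `GENTrainArgs` fields described in the release notes; the `eval_ratio` value and the commented-out options are assumptions to verify against the docs.

```python
from happytransformer import HappyGeneration, GENTrainArgs

happy_gen = HappyGeneration("GPT-2", "gpt2")

# eval_ratio (assumption: field name per the 3.0 release notes) controls how
# much of the supplied data is held out for automatic evaluation
args = GENTrainArgs(
    num_train_epochs=1,
    eval_ratio=0.2,
    # deepspeed="ZERO-2",  # assumption: enables DeepSpeed, per the 3.0 notes
)

happy_gen.train("train.txt", args=args)  # train.txt is a hypothetical local file
happy_gen.save("model/")
```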

Tasks

| Tasks                    | Inference | Training |
|--------------------------|-----------|----------|
| Text Generation          | ✔         | ✔        |
| Text Classification      | ✔         | ✔        |
| Word Prediction          | ✔         | ✔        |
| Question Answering       | ✔         | ✔        |
| Text-to-Text             | ✔         | ✔        |
| Next Sentence Prediction | ✔         |          |
| Token Classification     | ✔         |          |
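Each task in the table maps to a dedicated Happy class. A brief sketch of the imports, with one inference call as an example (default models are used; the context and question strings are illustrative):

```python
from happytransformer import (
    HappyGeneration,              # Text Generation
    HappyTextClassification,      # Text Classification
    HappyWordPrediction,          # Word Prediction
    HappyQuestionAnswering,       # Question Answering
    HappyTextToText,              # Text-to-Text
    HappyNextSentencePrediction,  # Next Sentence Prediction
    HappyTokenClassification,     # Token Classification
)

# Each class can be constructed with no arguments to use its default model
happy_qa = HappyQuestionAnswering()
result = happy_qa.answer_question(
    "Happy Transformer wraps Hugging Face models.",  # context
    "What does Happy Transformer wrap?",             # question
)
print(result[0].answer)
```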

Quick Start

```
pip install happytransformer
```

```python
from happytransformer import HappyWordPrediction
#--------------------------------------#
happy_wp = HappyWordPrediction()  # default uses distilbert-base-uncased
result = happy_wp.predict_mask("I think therefore I [MASK]")
print(result)  # [WordPredictionResult(token='am', score=0.10172799974679947)]
print(result[0].token)  # am
```
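`predict_mask()` also accepts `top_k` to return several candidates and `targets` to restrict scoring to specific tokens. A short sketch of both (scores printed will vary by model):

```python
from happytransformer import HappyWordPrediction

happy_wp = HappyWordPrediction()

# Return the two highest-scoring candidates instead of just one
results = happy_wp.predict_mask("I think therefore I [MASK]", top_k=2)
for r in results:
    print(r.token, r.score)

# Restrict scoring to a fixed set of candidate tokens
targeted = happy_wp.predict_mask("I think therefore I [MASK]", targets=["am", "exist"])
print(targeted[0].token)
```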

Maintainers

- Eric Fillion, Lead Maintainer
- Ted Brownlow, Maintainer

Tutorials

- Text generation with training (GPT-Neo)
- Text classification (training)
- Text classification (hate speech detection)
- Text classification (sentiment analysis)
- Word prediction with training (DistilBERT, RoBERTa)
- Top T5 Models
- Grammar Correction (a minimal sketch follows this list)
- Fine-tune a Grammar Correction Model
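As a taste of the grammar correction tutorials, here is a minimal sketch using HappyTextToText. The `vennify/t5-base-grammar-correction` checkpoint and the `"grammar: "` input prefix are assumptions based on the published model card; the generation settings are illustrative.

```python
from happytransformer import HappyTextToText, TTSettings

# Assumption: the vennify/t5-base-grammar-correction checkpoint from the Model Hub
happy_tt = HappyTextToText("T5", "vennify/t5-base-grammar-correction")

# Beam search tends to give more fluent corrections than greedy decoding
args = TTSettings(num_beams=5, min_length=1)

# Assumption: the model expects a "grammar: " prefix before the input text
result = happy_tt.generate_text(
    "grammar: This sentences has has bads grammar.",
    args=args,
)
print(result.text)
```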