daviddwlee84 / DeepLearningPractice

Neural-network-based implementations and the corresponding notes.

GPT2 Collection #13

Open daviddwlee84 opened 3 years ago

daviddwlee84 commented 3 years ago
| Code | Supports Chinese | Framework | Remark |
| --- | --- | --- | --- |
| openai/gpt-2: Code for the paper "Language Models are Unsupervised Multitask Learners" | No | TensorFlow 1.x | Official (OpenAI) repo; Better Language Models and Their Implications |
| minimaxir/gpt-2-simple: Python package to easily retrain OpenAI's GPT-2 text-generating model on new texts | No | TensorFlow 1.x | minimaxir/textgenrnn: Easily train your own text-generating neural network of any size and complexity on any text dataset with a few lines of code; "The simplest Python implementation of OpenAI's 'fake news' generator GPT-2" - Zhihu |
| yangjianxin1/GPT2-chitchat: GPT2 for Chinese chitchat (implements DialoGPT's MMI idea) | Yes | PyTorch | Very cool project based on Huggingface; "A GPT2 model for Chinese chitchat: GPT2-chitchat" - Zhihu |
| rish-16/gpt2client: ✍🏻 gpt2-client: Easy-to-use TensorFlow wrapper for the GPT-2 117M, 345M, 774M, and 1.5B Transformer models 🤖 📝 | No | TensorFlow 1.x | "Singapore high-school student open-sources a lightweight GPT-2 'client': play with GPT-2 in five lines of code" - Zhihu |
| Morizeyao/GPT2-Chinese: Chinese version of GPT2 training code, using a BERT tokenizer | Yes | PyTorch | Based on Huggingface |

Huggingface

GPT2LMHeadModel

TODO: WWM (whole word masking)?!

daviddwlee84 commented 3 years ago

Huggingface Chinese Models