# llm.go
[![Status](https://img.shields.io/badge/status-active-success.svg)]()
[![GitHub Issues](https://img.shields.io/github/issues/joshcarp/llm.go)](https://github.com/joshcarp/llm.go/issues)
[![GitHub Pull Requests](https://img.shields.io/github/issues-pr/joshcarp/llm.go)](https://github.com/joshcarp/llm.go/pulls)
[![License](https://img.shields.io/badge/license-apache2-blue.svg)](/LICENSE)
A GPT-2 implementation written in Go, using only the standard library.
## 🪞 Quick start
Install the Python dependencies and generate the tokenized dataset:

```sh
make setup
```
Run the training script:

```sh
make train
```

This runs `go run ./cmd/traingpt2/main.go`.
Run the testing script:

```sh
make test
```

This runs `go run ./cmd/testgpt2/main.go`.
## TODO
- [ ] Fix input tokenization: the current implementation is incorrect. It needs byte-pair merging, not tries (see the sketch after this list).
- [ ] Improve performance; it is currently very slow.
- [ ] It already runs in WASM, but trying WebGPU bindings might be fun.
- [ ] More refactoring.
- [ ] Running as CLI.
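For reference on the tokenizer item, here is a minimal sketch of the greedy byte-pair-merge loop that GPT-2-style tokenizers are built on. The `ranks` merge table and its contents are illustrative assumptions, not llm.go's actual data format:

```go
package main

import (
	"fmt"
	"math"
)

// bpeMerge repeatedly merges the adjacent token pair with the lowest
// merge rank until no pair in the sequence appears in the merge table.
// This is the standard greedy byte-pair-encoding loop; the ranks map
// (merged pair -> priority, lower merges first) is a stand-in for
// whatever merge table the real tokenizer loads.
func bpeMerge(tokens []string, ranks map[string]int) []string {
	for {
		best, bestRank := -1, math.MaxInt
		for i := 0; i+1 < len(tokens); i++ {
			if r, ok := ranks[tokens[i]+tokens[i+1]]; ok && r < bestRank {
				best, bestRank = i, r
			}
		}
		if best < 0 {
			return tokens // nothing left to merge
		}
		merged := tokens[best] + tokens[best+1]
		tokens = append(tokens[:best+1], tokens[best+2:]...) // drop tokens[best+1]
		tokens[best] = merged                                // replace pair with its merge
	}
}

func main() {
	// Toy merge table: "h"+"e" merges first, then "l"+"l", and so on.
	ranks := map[string]int{"he": 0, "ll": 1, "llo": 2, "hello": 3}
	fmt.Println(bpeMerge([]string{"h", "e", "l", "l", "o"}, ranks)) // [hello]
}
```

Note that real GPT-2 tokenization also byte-encodes the input and applies a regex pre-split before merging; this sketch shows only the merge step.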
## 🖋️ License
See LICENSE for more details.
## 🎉 Acknowledgements
- This is a fork of Andrej Karpathy's [llm.c](https://github.com/karpathy/llm.c), written in pure Go.