# swift-transformers

This is a collection of utilities to help adopt language models in Swift apps.

It tries to follow the Python `transformers` API and abstractions whenever possible, but it also aims to provide an idiomatic Swift interface and does not assume prior familiarity with `transformers` or `tokenizers`.

Please check our post.
## Modules

- `Tokenizers`: Utilities to convert text to tokens and back, following the abstractions in `tokenizers` and `transformers.js`. Usage example:

  ```swift
  import Tokenizers

  func testTokenizer() async throws {
      let tokenizer = try await AutoTokenizer.from(pretrained: "pcuenq/Llama-2-7b-chat-coreml")
      let inputIds = tokenizer("Today she took a train to the West")
      assert(inputIds == [1, 20628, 1183, 3614, 263, 7945, 304, 278, 3122])
  }
  ```

  However, you don't usually need to tokenize the input text yourself; the `Generation` code will take care of it.
- `Hub`: Utilities to download configuration files from the Hub, used to instantiate tokenizers and to learn about language model characteristics.
- `Generation`: Algorithms for text generation. Greedy search and top-k sampling are currently supported.
- `Models`: Language model abstraction over a Core ML package.
This package has been tested with several autoregressive language models. Encoder-decoder models such as T5 and Flan are currently not supported; they are high on our priority list.
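As an illustration of the top-k sampling strategy mentioned under `Generation`, here is a self-contained sketch of the technique. This is not the package's implementation; the function name and signature are hypothetical:

```swift
import Foundation

// Toy top-k sampler: keep the k highest logits, renormalize them with a
// softmax, and draw one token id from the resulting distribution.
// The `random` parameter is injectable so the function can be tested
// deterministically; by default it draws a uniform value in [0, 1).
func sampleTopK(logits: [Double], k: Int,
                random: () -> Double = { Double.random(in: 0..<1) }) -> Int {
    // Indices of the k largest logits, best first.
    let topK = logits.indices.sorted { logits[$0] > logits[$1] }.prefix(k)
    // Softmax over just those k logits (subtract the max for numerical stability).
    let maxLogit = logits[topK.first!]
    let exps = topK.map { exp(logits[$0] - maxLogit) }
    let total = exps.reduce(0, +)
    // Inverse-CDF sampling over the renormalized distribution.
    var r = random() * total
    for (index, weight) in zip(topK, exps) {
        r -= weight
        if r <= 0 { return index }
    }
    return topK.last!
}
```

With `k == 1` this degenerates to greedy search, which is why the two strategies are often implemented side by side.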
## Other Tools

- `swift-chat`, a simple app demonstrating how to use this package.
- `exporters`, a Core ML conversion package for transformers models, based on Apple's `coremltools`.
- `transformers-to-coreml`, a no-code Core ML conversion tool built on `exporters`.

## Usage via SwiftPM

To use `swift-transformers` with SwiftPM, you can add this to your `Package.swift`:
```swift
dependencies: [
    .package(url: "https://github.com/huggingface/swift-transformers", from: "0.1.5")
]
```
And then, add the Transformers library as a dependency to your target:
```swift
targets: [
    .target(
        name: "YourTargetName",
        dependencies: [
            .product(name: "Transformers", package: "swift-transformers")
        ]
    )
]
```
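To make the generation loop concrete, here is a minimal, self-contained sketch of the greedy-search strategy that the `Generation` module implements. The model is stubbed out as a closure; `greedyGenerate`, `nextLogits`, and the token ids are hypothetical names for illustration, not the package's API:

```swift
import Foundation

// Minimal greedy-search decoding loop. `nextLogits` maps the token sequence
// so far to a score per vocabulary entry; in swift-transformers this role is
// played by a Core ML-backed language model, and the prompt tokens come from
// a `Tokenizers` tokenizer.
func greedyGenerate(
    prompt: [Int],
    maxNewTokens: Int,
    eosTokenId: Int,
    nextLogits: ([Int]) -> [Double]
) -> [Int] {
    var tokens = prompt
    for _ in 0..<maxNewTokens {
        let logits = nextLogits(tokens)
        // Greedy search: always pick the highest-scoring token.
        let next = logits.indices.max { logits[$0] < logits[$1] }!
        tokens.append(next)
        // Stop early once the model emits the end-of-sequence token.
        if next == eosTokenId { break }
    }
    return tokens
}
```

This is why you rarely tokenize inputs manually: the generation code owns the tokenize → predict → append loop end to end.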
## Roadmap / To Do

- `tokenizers`: fallback `tokenizer_config.json` for known architectures whose models don't have a configuration in the Hub (GPT2).
- `exporters` – Core ML conversion tool:
  - `logits` from the converted Core ML model.
  - `coremltools` @ `main` for latest fixes. In particular, this merged PR makes it easier to use recent versions of transformers.
- `top-k` implementation in Accelerate.