`swift-transformers` is a collection of utilities to help adopt language models in Swift apps.
Users of the `transformers` Python library will find a familiar yet idiomatic Swift API.
Check out our v1.0 release post and our original announcement for more context on why we built this library.
The most commonly used modules from `swift-transformers` are `Tokenizers` and `Hub`, which allow fast tokenization and model downloads from the Hugging Face Hub. Tokenizing text should feel very familiar to those who have used the Python `transformers` library:
```swift
let tokenizer = try await AutoTokenizer.from(pretrained: "deepseek-ai/DeepSeek-R1-Distill-Qwen-7B")
let messages = [["role": "user", "content": "Describe the Swift programming language."]]
let encoded = try tokenizer.applyChatTemplate(messages: messages)
let decoded = tokenizer.decode(tokens: encoded)
```
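Beyond chat templates, the same tokenizer can be used on plain strings. Below is a minimal round-trip sketch reusing the `tokenizer` loaded above; it assumes the tokenizer's plain-text `encode(text:)` API alongside the `decode(tokens:)` call already shown.

```swift
// Plain-text round trip (no chat template), reusing `tokenizer` from the snippet above.
let ids = tokenizer.encode(text: "Swift is a general-purpose programming language.")
print(ids)                            // token IDs
print(tokenizer.decode(tokens: ids))  // back to text
```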
`swift-transformers` natively supports formatting inputs for tool calling, allowing for complex interactions with language models:
```swift
let tokenizer = try await AutoTokenizer.from(pretrained: "mlx-community/Qwen2.5-7B-Instruct-4bit")

// Explicit [String: Any] annotation so the heterogeneous literal compiles.
let weatherTool: [String: Any] = [
    "type": "function",
    "function": [
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": [
            "type": "object",
            "properties": ["location": ["type": "string", "description": "City and state"]],
            "required": ["location"]
        ]
    ]
]

let tokens = try tokenizer.applyChatTemplate(
    messages: [["role": "user", "content": "What's the weather in Paris?"]],
    tools: [weatherTool]
)
```
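The chat template returns token IDs ready to feed to a model. If you want to see the exact prompt it rendered, including the injected tool schema, you can decode the IDs back into a string. A small sketch reusing `tokenizer` and `tokens` from above:

```swift
// Decode the templated tokens to inspect the rendered prompt, tool schema included.
let renderedPrompt = tokenizer.decode(tokens: tokens)
print(renderedPrompt)
```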
Downloading models to a user's device quickly and reliably is a core requirement of on-device ML. `swift-transformers` provides a simple API to download models from the Hugging Face Hub, with progress reporting, handling of flaky connections, and more:
```swift
let repo = Hub.Repo(id: "mlx-community/Qwen2.5-0.5B-Instruct-2bit-mlx")
let modelDirectory: URL = try await Hub.snapshot(
    from: repo,
    matching: ["config.json", "*.safetensors"],
    progressHandler: { progress in
        print("Download progress: \(progress.fractionCompleted * 100)%")
    }
)
print("Files downloaded to: \(modelDirectory.path)")
```
The `Models` and `Generation` modules provide handy utilities when working with language models in Core ML. Check out our example converting and running Mistral 7B using Core ML here.
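The linked example is the best reference for the `Models` and `Generation` APIs themselves. For orientation, loading a converted model relies on Apple's standard CoreML framework; the sketch below uses only those Apple APIs, and the `modelURL` path is a hypothetical location of a converted `.mlpackage`.

```swift
import CoreML

// Compile the .mlpackage (skip this step if you already have a compiled .mlmodelc)
// and load it, targeting all available compute units.
let modelURL = URL(fileURLWithPath: "/path/to/Mistral7B.mlpackage")  // hypothetical path
let compiledURL = try await MLModel.compileModel(at: modelURL)

let configuration = MLModelConfiguration()
configuration.computeUnits = .all  // CPU, GPU, and Neural Engine when available

let model = try MLModel(contentsOf: compiledURL, configuration: configuration)
print(model.modelDescription.inputDescriptionsByName.keys)
```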
The Core ML modernization and the corresponding examples were primarily contributed by @joshnewnham, @1duo, @alejandro-isaza, and @aseemw. Thank you 🙏
To use `swift-transformers` with SwiftPM, add this to your `Package.swift`:
```swift
dependencies: [
    .package(url: "https://github.com/huggingface/swift-transformers", from: "0.1.17")
]
```
Then add the `Transformers` product as a dependency of your target:
```swift
targets: [
    .target(
        name: "YourTargetName",
        dependencies: [
            .product(name: "Transformers", package: "swift-transformers")
        ]
    )
]
```
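Putting the two snippets together, a complete minimal `Package.swift` might look like the sketch below; the package name, target name, and platform versions are placeholders to adjust for your project.

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "YourPackageName",                // placeholder
    platforms: [.iOS(.v16), .macOS(.v13)],  // illustrative minimums; adjust as needed
    dependencies: [
        .package(url: "https://github.com/huggingface/swift-transformers", from: "0.1.17")
    ],
    targets: [
        .target(
            name: "YourTargetName",         // placeholder
            dependencies: [
                .product(name: "Transformers", package: "swift-transformers")
            ]
        )
    ]
)
```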
Projects that use `swift-transformers`:

- WhisperKit: A Swift package for state-of-the-art speech-to-text systems, from Argmax.
- MLX Swift Examples: A Swift package for integrating MLX models in Swift apps.
Using `swift-transformers` in your project? Let us know and we'll add you to the list!
Other related tools:

- `swift-chat`, a simple app demonstrating how to use this package.
- `exporters`, a Core ML conversion package for transformers models, based on Apple's `coremltools`.
Swift Transformers is a community project and we welcome contributions. Please check out Issues tagged with `good first issue` if you are looking for a place to start!
Before submitting a pull request, please ensure your code:
- Passes the test suite (`swift test`)
- Passes linting checks (`swift format lint --recursive .`)
To format your code, run `swift format -i --recursive .`.