SwiftAI: A Modern Swift Library for Building AI-Powered Apps
In today’s tech world, artificial intelligence (AI) is becoming more and more important in app development. Whether you’re creating a simple chat app or a complex tool that needs smart responses, having a reliable way to work with AI models is key. That’s where SwiftAI comes in. SwiftAI is a modern, type-safe Swift library designed to make building AI-powered apps easier than ever. It provides a unified interface that works smoothly with different AI models—from Apple’s on-device models to popular cloud-based services like OpenAI. Let’s take a closer look at what SwiftAI offers, how to use it, and why it might be the right choice for your next project.
What Makes SwiftAI Stand Out?
SwiftAI comes packed with features that make it a powerful tool for developers. Let’s break down its main advantages:
- Works with Any Model: One of the biggest strengths of SwiftAI is that it's "model agnostic." It uses the same simple interface no matter which AI model you're using. Whether you're working with Apple's on-device models, OpenAI, Anthropic, or even a custom backend you've built, you won't have to rewrite your code.
- Structured Outputs: Instead of getting messy, unorganized text from the AI, SwiftAI lets you get data in a structured format that your app can use right away. It uses Swift's strong typing to make sure the data fits exactly what your code expects, so there's no guessing whether the AI's response will break your app.
- Tool Use Support: SwiftAI makes it easy for AI to use tools (like weather APIs or calculators) in your app. The AI can automatically decide when to use these tools to get real-time information, so you don't have to manually trigger them.
- Conversations That Remember: If you're building a chat app, SwiftAI's Chat feature keeps track of the conversation history. This means the AI remembers what was said earlier, making interactions feel more natural and coherent.
- Easy to Extend: SwiftAI has a plugin system that lets you add custom models or tools. This means you can adapt it to fit your specific needs without starting from scratch.
- Built for Modern Swift: It uses Swift's latest features like async/await and concurrency, making your code cleaner and easier to manage.
Getting Started with SwiftAI
Let’s dive into how to use SwiftAI. We’ll start with the basics and gradually move to more advanced features.
Step 1: Install SwiftAI
Before you can use SwiftAI, you need to install it. The easiest way is through Swift Package Manager, which works with Xcode and other Swift tools.
Using Xcode:
- Open your Xcode project.
- Go to File → Add Package Dependencies.
- In the search bar, enter: https://github.com/mi12labs/SwiftAI
- Click Add Package to finish the installation.
Using Package.swift:
If you’re using a Package.swift file for your project, add this line to your dependencies:
dependencies: [
  // "main" is a branch, not a version, so use the branch-based dependency form.
  .package(url: "https://github.com/mi12labs/SwiftAI", branch: "main")
]
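To make the library visible to your code, also add SwiftAI to the target that uses it. A minimal sketch, assuming the package exposes a library product named SwiftAI and using a hypothetical app target named MyApp:
targets: [
  .executableTarget(
    name: "MyApp",  // hypothetical target name
    dependencies: [
      .product(name: "SwiftAI", package: "SwiftAI")
    ]
  )
]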
Once installed, you’re ready to start using SwiftAI in your code.
Step 2: Your First AI Query
Let’s start with the simplest thing you can do: ask the AI a question and get an answer. Here’s how:
import SwiftAI
// Initialize Apple's on-device language model.
let llm = SystemLLM()
// Ask a question and get a response.
let response = try await llm.reply(to: "What is the capital of France?")
print(response.content) // "Paris"
Let’s break this down:
- SystemLLM() creates an instance of Apple's on-device AI model. This model runs directly on your iPhone or Mac, so it's fast and doesn't send data to the cloud.
- reply(to:) is the method that sends your question to the AI. The await keyword is used because AI processing takes a little time, and Swift handles this asynchronously (in the background) so your app doesn't freeze.
- The response from the AI is stored in response, and response.content gives you the actual text of the answer.
It’s that simple! With just a few lines of code, you can get answers from an AI model.
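Since reply(to:) is asynchronous and can throw, a real app will usually call it from an async function and handle failures. Here's a minimal sketch using only the API shown above:
import SwiftAI

func askCapital() async {
  let llm = SystemLLM()
  do {
    let response = try await llm.reply(to: "What is the capital of France?")
    print(response.content)
  } catch {
    // The on-device model may be unavailable, or generation may fail.
    print("AI request failed: \(error)")
  }
}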
Step 3: Getting Structured Data
Sometimes, you don’t want just a plain text answer—you want data that your app can work with directly, like a list of facts or a set of values. SwiftAI makes this easy with structured responses.
For example, let’s say you want information about a city, like its name, country, and population. You can define a structure (a “struct” in Swift) for this data, and SwiftAI will make sure the AI returns exactly that structure.
Here’s how:
// Define the structure you want back
@Generable
struct CityInfo {
  let name: String
  let country: String
  let population: Int
}

let response = try await llm.reply(
  to: "Tell me about Tokyo",
  returning: CityInfo.self  // Tell the LLM what to output
)

let cityInfo = response.content
print(cityInfo.name)       // "Tokyo"
print(cityInfo.country)    // "Japan"
print(cityInfo.population) // 13960000
What’s happening here?
- The @Generable attribute tells SwiftAI that this struct can be created by the AI. It's like giving the AI a blueprint of what you need.
- returning: CityInfo.self tells the AI that you want a CityInfo object instead of a plain string.
- The AI then generates a response that fits this structure, and SwiftAI converts it automatically, so there's no need for you to parse messy JSON or text.
This is a game-changer because it ensures the data your app uses is always in the right format. If the AI can’t generate valid data (for example, if it can’t find the population), SwiftAI will give you an error instead of letting bad data break your app.
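For instance, you can wrap the structured request in do/catch so a failed generation surfaces as a thrown error instead of malformed data. A sketch reusing the CityInfo type defined above:
do {
  let response = try await llm.reply(
    to: "Tell me about Atlantis",  // a city the model has no real data for
    returning: CityInfo.self
  )
  print(response.content.name)
} catch {
  // SwiftAI throws rather than returning a CityInfo with missing or invalid fields.
  print("Could not generate a valid CityInfo: \(error)")
}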
Step 4: Letting AI Use Tools
AI models are great at understanding language, but they don’t always have real-time information (like the current weather). That’s where tools come in. SwiftAI lets you create tools (like a weather checker) that the AI can use when it needs specific information.
Here’s an example of a weather tool:
// Create a tool the AI can use
struct WeatherTool: Tool {
  let description = "Get current weather for a city"

  @Generable
  struct Arguments {
    let city: String
  }

  func call(arguments: Arguments) async throws -> String {
    // Your weather API logic here (e.g., connect to a weather service)
    return "It's 72°F and sunny in \(arguments.city)"
  }
}

// Use the tool with your AI
let weatherTool = WeatherTool()
let response = try await llm.reply(
  to: "What's the weather like in San Francisco?",
  tools: [weatherTool]
)
print(response.content) // "Based on current data, it's 72°F and sunny in San Francisco"
How does this work?
- The WeatherTool struct follows the Tool protocol, which means it can be used by the AI. The description tells the AI what the tool does (so it knows when to use it).
- The Arguments struct defines what information the tool needs, in this case the name of a city.
- The call method is where the tool does its work. Here, it could connect to a real weather API, but for simplicity, we're returning a sample response.
- When you pass tools: [weatherTool] to reply(to:), the AI reads the tool's description and decides if it needs to use it. For the question about San Francisco's weather, it knows to use the WeatherTool to get the current conditions.
The best part? You don’t have to tell the AI when to use the tool—it figures it out on its own. This makes your app smarter and more self-sufficient.
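You can also hand the model more than one tool and let it decide which to call. Here's a sketch that adds a hypothetical CalculatorTool (not part of SwiftAI) alongside the weather tool, following the same Tool pattern shown above:
// A second tool, defined the same way as WeatherTool.
struct CalculatorTool: Tool {
  let description = "Evaluate a simple arithmetic expression"

  @Generable
  struct Arguments {
    let expression: String
  }

  func call(arguments: Arguments) async throws -> String {
    // A real implementation would parse and evaluate the expression.
    return "Computed result for \(arguments.expression)"
  }
}

// The model picks whichever tool (if any) each part of the question needs.
let response = try await llm.reply(
  to: "What's the weather in San Francisco, and what is 42 * 2?",
  tools: [WeatherTool(), CalculatorTool()]
)
print(response.content)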
Step 5: Switching Between AI Models
Different AI models have different strengths. For example, Apple’s on-device model is fast and private, but a cloud model like OpenAI might be better for complex tasks. SwiftAI makes it easy to switch between models without changing your code.
Here’s how you can set up your app to use the best available model:
// Choose your model based on availability
let llm: any LLM = {
  let systemLLM = SystemLLM()
  return systemLLM.isAvailable ? systemLLM : OpenaiLLM(apiKey: "your-api-key")
}()
// Same code works with any model
let response = try await llm.reply(to: "Write a haiku about Berlin.")
print(response.content)
What’s happening here?
- SystemLLM is Apple's on-device model. It's private (your data stays on your device), fast, and free to use.
- OpenaiLLM uses OpenAI's cloud service. It's more powerful for complex tasks but requires an API key (and may have costs).
- isAvailable checks if the on-device model is ready to use (some older devices might not support it).
- The llm variable is defined as any LLM, which means it can be any AI model that follows SwiftAI's LLM protocol. This is why the same reply(to:) method works no matter which model you use.
This flexibility lets you optimize your app for different situations: use the on-device model for quick, private tasks, and switch to a cloud model when you need more power.
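Because your code depends only on the LLM protocol, you can keep the model choice in one place and pass whichever model you picked into the rest of your app. A small sketch of that pattern, assuming content is a plain String for text replies as in the earlier examples:
// The same helper works whether `llm` is the on-device model or a cloud model.
func summarize(_ text: String, using llm: any LLM) async throws -> String {
  let response = try await llm.reply(to: "Summarize this in one sentence: \(text)")
  return response.content
}

// Uses whichever model the availability check above selected.
let summary = try await summarize("SwiftAI is a type-safe Swift library for AI-powered apps.", using: llm)
print(summary)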
Step 6: Having Conversations
If you're building a chat app, you need the AI to remember what was said earlier. SwiftAI's Chat feature handles this by keeping track of the conversation history.
// Create a chat with tools
let chat = try Chat(with: llm, tools: [weatherTool])
// Have a conversation
let greeting = try await chat.send("Hello! I'm planning a trip.")
let advice = try await chat.send("What should I pack for Seattle?")
// The AI remembers context from previous messages
Here’s why this matters:
- Chat stores all previous messages, so the AI knows the context of the conversation. For example, if you first say you're planning a trip, then ask about packing for Seattle, the AI will connect the two and suggest weather-appropriate clothes.
- send() works like reply(to:) but adds each message to the conversation history.
- You can still use tools in conversations, so the AI can get real-time info (like Seattle's weather) to give better advice.
This makes conversations feel natural, just like talking to a real person who remembers what you’ve discussed.
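For example, a small command-line style loop can keep feeding user input into the same Chat so context builds up across turns. A sketch (the readLine-based input and "quit" command are just for illustration):
func runTravelChat(with llm: any LLM) async throws {
  let chat = try Chat(with: llm, tools: [WeatherTool()])

  while let line = readLine(), line.lowercased() != "quit" {
    // Each send() adds the message to the shared history,
    // so the model sees earlier turns when it answers.
    let answer = try await chat.send(line)
    print(answer)
  }
}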
Step 7: Adding Constraints to AI Outputs
Sometimes, you need the AI to follow specific rules when generating data. For example, a username might need to start with a letter, or an age should be between 13 and 120. SwiftAI lets you add these constraints with the @Guide attribute.
@Generable
struct UserProfile {
  @Guide(description: "A valid username starting with a letter", .pattern("^[a-zA-Z][a-zA-Z0-9_]{2,}$"))
  let username: String

  @Guide(description: "User age in years", .minimum(13), .maximum(120))
  let age: Int

  @Guide(description: "One to three favorite colors", .minimumCount(1), .maximumCount(3))
  let favoriteColors: [String]
}
Let’s see what each part does:
- @Guide adds rules and descriptions to help the AI generate correct data.
- .pattern("^[a-zA-Z][a-zA-Z0-9_]{2,}$") is a regex (a pattern) that ensures usernames start with a letter and have at least 3 characters (letters, numbers, or underscores).
- .minimum(13) and .maximum(120) make sure the age is within a reasonable range.
- .minimumCount(1) and .maximumCount(3) ensure the list of favorite colors has between 1 and 3 items.
These constraints help the AI generate data that fits your app’s needs, reducing errors and making your code more reliable.
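Requesting a constrained type works just like any other structured reply; the guides travel with the struct. A sketch using the UserProfile defined above:
let response = try await llm.reply(
  to: "Create a sample user profile for a hiking app",
  returning: UserProfile.self
)

let profile = response.content
print(profile.username)       // starts with a letter, per the pattern guide
print(profile.age)            // between 13 and 120
print(profile.favoriteColors) // one to three colors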
Quick Reference: What to Use When
Here's a simple guide to help you choose the right SwiftAI feature for what you need to do:
- A quick question and a text answer: call reply(to:) on any LLM.
- Data your app can use directly: define a @Generable struct and pass it via returning:.
- Real-time or external information: implement the Tool protocol and pass your tools with tools:.
- Multi-turn conversations that remember context: use Chat and send().
- Rules on generated values: annotate @Generable properties with @Guide.
Supported AI Models
SwiftAI works with several types of AI models, each with its own pros and cons. Here’s a breakdown:
- SystemLLM: Runs directly on your Apple device (iPhone, iPad, Mac). It's private because your data doesn't leave the device, and it's fast. It's a good choice for simple tasks and when privacy is important.
- OpenaiLLM: Uses OpenAI's cloud servers. It's more powerful and can handle complex tasks, but your data is sent to OpenAI's servers, and you'll need an API key (which may have associated costs).
- CustomLLM: If you have your own AI model or backend, you can create a custom LLM to use with SwiftAI. This gives you full control over privacy, capabilities, and cost.
Examples of SwiftAI in Action
If you want to see SwiftAI in use, check out the Examples/ directory in the SwiftAI repository. These sample apps show how to implement different features, from simple queries to full conversations with tools. They're a great way to learn by example.
What’s Coming Next?
SwiftAI is still in alpha (early development), so some features are still being worked on and coverage is not yet on par with Apple's FoundationModels SDK. The SwiftAI team is actively working on these features, so keep an eye on the project's issue tracker for updates.
Contributing to SwiftAI
SwiftAI is an open-source project, and contributions from developers like you are welcome! Here’s how you can get started:
Development Setup
- Clone the repository: git clone https://github.com/mi12labs/SwiftAI.git
- Navigate to the project directory: cd SwiftAI
- Build the project: swift build
- Run the tests to make sure everything works: swift test
Before contributing, be sure to read the Contributing Guidelines to learn about the project’s standards and process.
License
SwiftAI is released under the MIT License. This means you can use, modify, and distribute the library for both personal and commercial projects, as long as you include the original license notice. You can find the full license details in the LICENSE file.
Note: SwiftAI is in Alpha
It’s important to remember that SwiftAI is currently in alpha. This means there might be bugs, and some features could change in future updates. It’s a good idea to test thoroughly before using it in production apps, but it’s still a great tool to experiment with and contribute to.
SwiftAI is built with care for the Swift community, aiming to make AI accessible to all developers. Whether you’re a beginner or an experienced Swift programmer, SwiftAI’s simple, unified interface can help you add powerful AI features to your apps with less code and fewer headaches. Give it a try, and see what you can build!