Swift SDK for Microsoft's Custom Vision Service
Works with UIKit & Foundation objects like UIImage, and exports CoreML models for use offline.

Carthage is a decentralized dependency manager that builds your dependencies and provides you with binary frameworks.
You can install Carthage with Homebrew using the following command:
$ brew update
...
$ brew install carthage
To integrate CustomVision into your Xcode project using Carthage, specify it in your Cartfile:
github "colbylwilliams/CustomVision"
Run `carthage update` to build the framework, then drag the built `CustomVision.framework` into your Xcode project.
CocoaPods is a dependency manager for Cocoa projects. You can install it with the following command:
[sudo] gem install cocoapods
CocoaPods 1.3+ is required.
To integrate CustomVision into your project, specify it in your Podfile:
source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '12.0'
use_frameworks!
pod 'CustomVision', '~> 1.0'
Then, run the following command:
$ pod install
...
Coming soon
To get started using CustomVision, you need to provide the SDK with your Training Key and Prediction Key.
If you're working with a single project, you can also provide a default Project ID that will be used for every project operation (instead of passing it in as a parameter every time).
There are two ways to provide the Training Key, Prediction Key, and Project ID: programmatically, or by adding them to a plist file.
The simplest way to provide these values and start using the SDK is to set the values programmatically:
CustomVisionClient.defaultProjectId = "CUSTOM_VISION_PROJECT_ID"
CustomVisionClient.subscriptionRegion = "CUSTOM_VISION_SUBSCRIPTION_REGION"
CustomVisionClient.shared.trainingKey = "CUSTOM_VISION_TRAINING_KEY"
CustomVisionClient.shared.predictionKey = "CUSTOM_VISION_PREDICTION_KEY"
CustomVisionClient.shared.getIterations { r in
// r.resource is [Iteration]
}
Alternatively, you can provide these values in your project's `info.plist`, in a separate `CustomVision.plist`, or in your own plist file whose name you provide.
Simply add the `CustomVisionTrainingKey`, `CustomVisionPredictionKey`, `CustomVisionProjectId`, and `CustomVisionSubscriptionRegion` keys and provide your Training Key, Prediction Key, default Project ID, and Subscription Region respectively.
Note: This method is provided for convenience when quickly developing samples; it is not recommended to ship these values in a plist in production apps.
...
<dict>
<key>CFBundleName</key>
<string>$(PRODUCT_NAME)</string>
<key>CustomVisionProjectId</key>
<string>CUSTOM_VISION_PROJECT_ID</string>
<key>CustomVisionTrainingKey</key>
<string>CUSTOM_VISION_TRAINING_KEY</string>
<key>CustomVisionPredictionKey</key>
<string>CUSTOM_VISION_PREDICTION_KEY</string>
<key>CustomVisionSubscriptionRegion</key>
<string>CUSTOM_VISION_SUBSCRIPTION_REGION</string>
...
Or add a `CustomVision.plist` file:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>CustomVisionProjectId</key>
<string>CUSTOM_VISION_PROJECT_ID</string>
<key>CustomVisionTrainingKey</key>
<string>CUSTOM_VISION_TRAINING_KEY</string>
<key>CustomVisionPredictionKey</key>
<string>CUSTOM_VISION_PREDICTION_KEY</string>
<key>CustomVisionSubscriptionRegion</key>
<string>CUSTOM_VISION_SUBSCRIPTION_REGION</string>
</dict>
</plist>
Finally, you can add the `CustomVisionTrainingKey`, `CustomVisionPredictionKey`, `CustomVisionProjectId`, and `CustomVisionSubscriptionRegion` key/value pairs to any plist in your project's main bundle and provide the name of the plist:
CustomVisionClient.shared.getKeysFrom(plistNamed: "SuperDuperDope")
The CustomVision SDK adds several convenience functions to make uploading new training images to your project as easy as possible.
This example demonstrates creating a new Tag in the Custom Vision project, then uploading several new training images to the project, tagging each with the newly created tag:
let client = CustomVisionClient.shared
let tag = "New Tag" // doesn't exist in the project yet
let images: [UIImage] = // several UIImages
client.createImages(from: images, withNewTagNamed: tag) { r in
// r.resource is ImageCreateSummary
}
One of the most useful features of this SDK is the ability to re-train your Project, export the newly trained model (Iteration), download it to the device's file system, and compile the model on-device for use with CoreML.
func updateUser(message: String) {
// update user
}
let client = CustomVisionClient.shared
client.trainAndDownloadCoreMLModel(withName: "myModel", progressUpdate: updateUser) { (success, message) in
    // success indicates whether training, download, and on-device compilation completed
}
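If you prefer not to define a separate function, the same call can be made with an inline progress closure. This is a minimal sketch using only the API shown above, assuming `progressUpdate` accepts any `(String) -> Void` closure:

```swift
import CustomVision

let client = CustomVisionClient.shared

// Re-train, export, download, and compile the model, logging each status update
client.trainAndDownloadCoreMLModel(withName: "myModel", progressUpdate: { message in
    print("progress: \(message)")
}) { (success, message) in
    print(success ? "model ready" : "failed: \(message)")
}
```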
Once the compiled model is persisted to the device's file system (above), you can get the URL of the model using the client's `getModelUrl()` function:
if let url = client.getModelUrl(), let myModel = try? MLModel(contentsOf: url) {
    // use myModel
}
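Once you have the compiled `MLModel`, one common next step is to wrap it in a `VNCoreMLModel` and classify an image with Apple's Vision framework. The sketch below uses only standard CoreML/Vision APIs; the `classify` function name and image handling are illustrative assumptions, and `url` is the model URL returned by `getModelUrl()`:

```swift
import CoreML
import Vision
import UIKit

// A minimal sketch: classify a UIImage with the downloaded, compiled model.
// Assumes `url` is the model URL returned by client.getModelUrl().
func classify(_ image: UIImage, modelAt url: URL) {
    guard let cgImage = image.cgImage,
          let mlModel = try? MLModel(contentsOf: url),
          let vnModel = try? VNCoreMLModel(for: mlModel) else { return }

    let request = VNCoreMLRequest(model: vnModel) { request, _ in
        // For a classifier, each result's identifier is a Custom Vision tag name
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print("\(top.identifier): \(top.confidence)")
        }
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```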
Licensed under the MIT License. See LICENSE for details.