Machine learning has been the subject of great hype within the tech industry, sparking interest among seasoned and aspiring engineers alike. In spite of the large barriers to entry, high-level frameworks like Keras, scikit-learn, and IBM Watson help democratize artificial intelligence to the masses. The introduction of Core ML by Apple in 2017 allowed developers who aren't necessarily experts in machine learning to apply such technologies in their apps.
However, Core ML is not a machine learning framework; it is a framework that allows developers to apply machine learning models in their apps. While it has shown great promise for the future of machine learning on mobile devices, it lacked the ability to improve over time. Apple's annual Worldwide Developers Conference in 2019 changed that with the introduction of Core ML 3.
Core ML 3 is probably one of the most underrated releases of WWDC 2019, largely dwarfed by other features and frameworks like dark mode, iPadOS, and SwiftUI. Nonetheless, it deserves attention for what it can bring to future apps. Core ML 3 gives developers the ability to generate updatable machine learning models; in other words, on-device model training. You can still use popular machine learning libraries like Keras and Turi Create to generate these models, but to make a Core ML model updatable, you will need coremltools.
There are six general steps to create an updatable neural network model. From an existing model, we:

1. Load the model's specification
2. Set the training inputs and outputs
3. Mark the layers we want to train as updatable
4. Set the loss function
5. Set the optimizer and the number of epochs
6. Save the new updatable model
I will demonstrate how to apply these steps to an MNIST handwritten digit MLModel with coremltools. This article is broken up into two parts:
Part 1 - Generating an updatable MLModel
Part 2 - Applying the updatable MLModel (coming soon)
To follow along, you will need the coremltools Python package (pip install coremltools) and a trained MNIST digit classifier saved as an .mlmodel file.
import coremltools

# Load the model specification and wrap it in a builder
specs = coremltools.utils.load_spec('mnist_mlmodel_path')
builder = coremltools.models.neural_network.NeuralNetworkBuilder(spec=specs)
If you print nn_spec, you'll see the properties of your neural network. We want the training input and output specs to match those of our original model for on-device training.
nn_spec = builder.spec

# Use the model's existing input as the first training input
nn_spec.description.trainingInput.extend([nn_spec.description.input[0]])
nn_spec.description.trainingInput[0].shortDescription = 'Example of handwritten digit'

# Use the model's existing output as the training target
nn_spec.description.trainingInput.extend([nn_spec.description.output[0]])
nn_spec.description.trainingInput[1].shortDescription = 'Associated true label of example image'
You can view your updatable layers with builder.inspect_updatable_layers(). For this example, dense_1 and dense_2 are updatable layers.
Since we are classifying handwritten digits from the MNIST dataset, we can use the categorical_cross_entropy loss function.
# Set the loss function
builder.set_categorical_cross_entropy_loss(name='loss_layer',
                                           input='prediction',
                                           target='trueDigit')

# Optimizer
from coremltools.models.neural_network import SgdParams as sgd
builder.set_sgd_optimizer(sgd(lr=0.01, batch=32))

# Number of epochs
builder.set_epochs(50)
from coremltools.models import MLModel

updatable_mlmodel = MLModel(nn_spec)
updatable_mlmodel.save('./UpdatableMNIST.mlmodel')
If you open your new MLModel in Xcode, you'll see two new sections: Update and Parameters. The Update section describes the types of inputs and outputs needed to update the model. The Parameters section describes the parameters with which the model will train.