Deep learning has revolutionized machine learning. But designing the right neural network (layers, neurons, activation functions, optimizers) can feel like an endless experiment. Wouldn't it be nice if someone did the heavy lifting for you?
That's exactly what AutoML for deep learning aims to solve.
In this article, I'll introduce you to two powerful yet approachable tools for automating deep learning: AutoKeras and Keras Tuner. We will take a deep dive into these libraries and do some hands-on model building.
Why automate deep learning?
Deep learning model design and hyperparameter tuning are resource-intensive. It's easy to:
- Overfit by using too many parameters.
- Waste time testing architectures manually.
- Miss better-performing configurations.
AutoML tools remove much of the guesswork by automating architecture search and tuning.
How do these libraries work?
AutoKeras
AutoKeras leverages Neural Architecture Search (NAS) techniques behind the scenes. It uses a trial-and-error approach, powered by Keras Tuner under the hood, to test different configurations. Once a candidate architecture is found, it trains it to convergence and evaluates it.
Keras Tuner
Keras Tuner is focused on hyperparameter optimization. You define the search space (e.g., number of layers, number of units, learning rates), and it uses optimization algorithms (random search, Bayesian optimization, Hyperband) to find the best configuration.
Installing required libraries
Installing these libraries is quite easy; we just need to use pip. We can run the commands below in a Jupyter notebook to install both libraries.
pip install autokeras
pip install keras-tuner
AutoKeras: End-to-end automated deep learning
AutoKeras is a high-level library built on top of TensorFlow and Keras. It automates:
- Neural architecture search (NAS)
- Hyperparameter tuning
- Model training
With just a few lines of code, you can train deep learning models for image, text, tabular, and time-series data.
Creating the model
For this article, we'll work on image classification. We will load the MNIST dataset from the Keras datasets module (it's freely available and downloads automatically), and then we'll use the ImageClassifier from AutoKeras.
import autokeras as ak
from tensorflow.keras.datasets import mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()
clf = ak.ImageClassifier(max_trials=3)  # Try 3 different models
clf.fit(x_train, y_train, epochs=10)
accuracy = clf.evaluate(x_test, y_test)
print("Test accuracy:", accuracy)
We can see in the output screenshot that trial 2 took 42 minutes; the search will continue until all 3 trials are complete and then show us the best model with its parameters.
Keras Tuner: Flexible hyperparameter optimization
Keras Tuner, developed by the TensorFlow team, is a hyperparameter optimization library. Unlike AutoKeras, it doesn't design architectures from scratch; instead, it tunes the hyperparameters of the architecture you define.
Creating the model
Unlike AutoKeras, here we must define the full model ourselves. We will use the same MNIST image dataset and create a simple dense-layer image classifier.
import tensorflow as tf
import keras_tuner as kt
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
def build_model(hp):
    model = tf.keras.Sequential()
    model.add(tf.keras.layers.Flatten())
    model.add(tf.keras.layers.Dense(hp.Int('units', 32, 512, step=32), activation='relu'))
    model.add(tf.keras.layers.Dense(10, activation='softmax'))
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    return model
tuner = kt.Hyperband(build_model, objective='val_accuracy', max_epochs=10)
tuner.search(x_train, y_train, epochs=10, validation_split=0.2)

So, in the output, we can clearly see that trial 21 completed in 22 seconds with a validation accuracy of ~97.29%, and the tuner also keeps track of the best accuracy so far, which is ~97.93%.
To retrieve the best model, we can run the commands given below.
models = tuner.get_best_models(num_models=2)
best_model = models[0]
best_model.summary()

We can also inspect the top 10 trials that Keras Tuner performed using the command below.
tuner.results_summary()

Real-life use case
A good example of how both libraries can be used together: a telecom company wants to predict customer churn from structured customer data. Its data science team uses AutoKeras to quickly train a model on tabular features without writing complex architecture code. Later, they use Keras Tuner to fine-tune a custom neural network that incorporates domain-knowledge features. This hybrid approach saves weeks of experimentation and improves model performance.
Conclusion
Both AutoKeras and Keras Tuner make deep learning more accessible and efficient.
Use AutoKeras when you want a quick, end-to-end model without worrying about architecture. Use Keras Tuner when you already have a good idea of your architecture but want to squeeze out the best performance through hyperparameter tuning.
Automating parts of deep learning frees you up to focus on understanding your data and interpreting results, which is where the real value lies.