Experiment Manager - Design and run experiments to train and compare deep learning networks - MATLAB ([original](https://www.mathworks.com/help/deeplearning/ref/experimentmanager-app.html))

Design and run experiments to train and compare deep learning networks

Since R2020a

Description

You can use the Experiment Manager app to create deep learning experiments to train networks under different training conditions and compare the results. For example, you can use Experiment Manager to sweep through a range of hyperparameter values or use Bayesian optimization to find optimal training options, and to train networks by using built-in or custom training functions.

To set up your experiment quickly, you can start with a preconfigured template. The experiment templates support workflows that include image classification and regression, sequence classification, audio classification, signal processing, semantic segmentation, and custom training loops.

The Experiment Browser panel displays the hierarchy of experiments and results in a project. The icon next to the experiment name indicates its type.

This page contains information about built-in and custom training experiments for Deep Learning Toolbox™. For general information about using the app, see Experiment Manager. For information about using Experiment Manager with the Classification Learner and Regression Learner apps, see Experiment Manager (Statistics and Machine Learning Toolbox).


Open the Experiment Manager App

For general information about using the app, see Experiment Manager.
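In releases where the app is included with MATLAB (see the R2023b release note below), you can also open it programmatically. A minimal sketch:

```matlab
% Open the Experiment Manager app from the MATLAB command prompt.
experimentManager
```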

Examples


Quickly Set Up Experiment Using Preconfigured Template

Quickly set up an experiment using a preconfigured experiment template.

Open the Experiment Manager app. In the dialog box, you can create a new project or open an example from the documentation. Under New, select Blank Project.

In the next dialog box, you can open a blank experiment template or one of the preconfigured experiment templates to support your AI workflow. For example, under Image Classification Experiments, select the preconfigured template Image Classification by Sweeping Hyperparameters.

Experiment Manager dialog box with blank experiment templates and preconfigured experiment templates

Specify the name and location for the new project. Experiment Manager opens a new experiment in the project.

The experiment is a built-in training experiment that uses the trainnet training function, indicated by the blue Erlenmeyer flask icon.

The experiment definition tab displays the description, hyperparameters, setup function, post-training custom metrics, and supporting files that define the experiment. You can modify these parameters to quickly set up your experiment, and then run the experiment.

For information about how to run the experiment and compare results after you configure the experiment parameters, see Experiment Manager.

Experiment definition tab for the experiment created using the preconfigured image classification template

Train Network Using trainnet and Display Custom Metrics

Set up an experiment that trains a network by using the trainnet function and an exhaustive hyperparameter sweep. Built-in training experiments support workflows such as image, sequence, time-series, or feature classification and regression.

Open the Experiment Manager app. In the dialog box, you can create a new project or open an example from the documentation. Under New, select Blank Project.

In the next dialog box, you can open a blank experiment template or one of the preconfigured experiment templates to support your AI workflow. Under Blank Experiments, select the blank template Built-In Training (trainnet).

The experiment is a built-in training experiment that uses the trainnet training function, indicated by the blue Erlenmeyer flask icon.

The experiment definition tab displays the description, hyperparameters, setup function, post-training custom metrics, and supporting files that define the experiment. When starting with a blank experiment template, you must manually configure these parameters. If you prefer a template with some preconfigured parameters, select one of the preconfigured built-in training templates instead from the Experiment Manager dialog box.

Experiment definition tab showing the default configuration for a built-in training experiment

Configure the experiment parameters.
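The experiment definition tab also lists post-training custom metric functions. As an illustration, the sketch below assumes the documented single-input metric-function form, in which trialInfo is a structure with fields that include trainedNetwork, trainingInfo, and parameters; the function name NumLearnables is arbitrary:

```matlab
function metric = NumLearnables(trialInfo)
% Post-training custom metric (illustrative): return the total number of
% learnable parameters in the trained network as a scalar so that it
% appears as an extra column in the results table.
    net = trialInfo.trainedNetwork;                       % dlnetwork trained by trainnet
    metric = sum(cellfun(@numel,net.Learnables.Value));   % count all learnable values
end
```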

For information about how to run the experiment and compare results after you configure the experiment parameters, see Experiment Manager.

Optimize Training Using trainnet and Bayesian Optimization

Set up an experiment that trains using the trainnet function and Bayesian optimization. Built-in training experiments support workflows such as image, sequence, time-series, or feature classification and regression.

Open the Experiment Manager app. In the dialog box, you can create a new project or open an example from the documentation. Under New, select Blank Project.

In the next dialog box, you can open a blank experiment template or one of the preconfigured experiment templates to support your AI workflow. Under Blank Experiments, select the blank template Built-In Training (trainnet).

The experiment is a built-in training experiment that uses the trainnet training function, indicated by the blue Erlenmeyer flask icon.

The experiment definition tab displays the description, hyperparameters, setup function, post-training custom metrics, and supporting files that define the experiment. When starting with a blank experiment template, you must manually configure these parameters. If you prefer a template with some preconfigured parameters, select one of the preconfigured built-in training templates instead from the Experiment Manager dialog box.

Experiment definition tab showing the default configuration for a built-in training experiment

Configure the experiment parameters.
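Whichever strategy you use, each trial's hyperparameter values reach the setup function through the params structure. This minimal sketch shows the relevant lines, assuming hypothetical hyperparameter names InitialLearnRate and Momentum defined in the Hyperparameters table:

```matlab
% Stand-in for the values that Bayesian optimization selects for one trial
% (in the app, these arrive as fields of the setup function input params).
params.InitialLearnRate = 0.01;
params.Momentum = 0.9;

% Inside the setup function, build the training options from those values.
options = trainingOptions("sgdm", ...
    InitialLearnRate=params.InitialLearnRate, ...
    Momentum=params.Momentum, ...
    MaxEpochs=8, ...
    Verbose=false);
```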

For information about how to run the experiment and compare results after you configure the experiment parameters, see Experiment Manager.

Train Network Using Custom Training Loop and Display Visualization

Set up an experiment that trains using a custom training function and creates custom visualizations.

Custom training experiments support workflows that require a training function other than trainnet. These workflows include custom training loops for dlnetwork objects and other models that the trainnet function does not support.

Open the Experiment Manager app. In the dialog box, you can create a new project or open an example from the documentation. Under New, select Blank Project.

In the next dialog box, you can open a blank experiment template or one of the preconfigured experiment templates to support your AI workflow. Under Blank Experiments, select the blank template Custom Training.

The experiment is a custom training experiment that uses a custom training function, indicated by the purple beaker icon.

The experiment definition tab displays the description, hyperparameters, training function, and supporting files that define the experiment. When starting with a blank experiment template, you must manually configure these parameters. If you prefer a template with some preconfigured parameters, select one of the preconfigured custom training templates instead from the Experiment Manager dialog box.

Experiment definition tab showing the default configuration for a custom training experiment

Configure the experiment parameters.
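As an illustration of the training function that a custom training experiment calls, here is a minimal sketch. It assumes the documented two-input form, in which params is a structure of hyperparameter values (InitialLearnRate is a hypothetical name) and monitor is an experiments.Monitor object; the loop only simulates training, whereas a real experiment would build a dlnetwork and update it with dlfeval and a custom loss here:

```matlab
function output = Experiment_training(params,monitor)
% Illustrative custom training function for Experiment Manager.
    monitor.Metrics = "TrainingLoss";   % metric plotted in the training plot
    monitor.XLabel = "Iteration";

    numIterations = 100;
    for iteration = 1:numIterations
        loss = params.InitialLearnRate*exp(-iteration/25);   % placeholder value
        recordMetrics(monitor,iteration,TrainingLoss=loss);  % update the plot
        monitor.Progress = 100*iteration/numIterations;      % update the progress bar

        if monitor.Stop   % true when you click Stop in the app
            break
        end
    end

    output = [];   % typically the trained network or other results to save
end
```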

For information about how to run the experiment and compare results after you configure the experiment parameters, see Experiment Manager.

Run Experiment Trials in Parallel

You can decrease the run time of some experiments if you have Parallel Computing Toolbox or MATLAB Parallel Server.

By default, Experiment Manager runs one trial at a time. If you have Parallel Computing Toolbox, you can run multiple trials at the same time or run a single trial on multiple GPUs, on a cluster, or in the cloud. If you have MATLAB Parallel Server, you can also offload experiments as batch jobs in a remote cluster so that you can continue working or close your MATLAB session while your experiment runs.
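For built-in training experiments, using multiple GPUs or a parallel pool for a single trial is typically requested through the training options that the setup function returns rather than through the app itself. A minimal sketch, assuming Parallel Computing Toolbox is installed:

```matlab
% Ask the built-in training function to use all available local GPUs for
% one trial (requires Parallel Computing Toolbox).
options = trainingOptions("sgdm", ...
    ExecutionEnvironment="multi-gpu", ...
    MaxEpochs=5, ...
    Verbose=false);
```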

In the Experiment Manager toolstrip, in the Execution section, use the Mode list to specify an execution mode. If you select the Batch Sequential or Batch Simultaneous execution mode, use the Cluster list and Pool Size field in the toolstrip to specify your cluster and pool size.

For more information, see Run Experiments in Parallel or Offload Experiments as Batch Jobs to a Cluster.

More About


Exhaustive Sweep

To sweep through a range of hyperparameter values, set Strategy to Exhaustive Sweep. In the Hyperparameters table, enter the names and values of the hyperparameters used in the experiment. Hyperparameter names must start with a letter, followed by letters, digits, or underscores. Hyperparameter values must be scalars or vectors with numeric, logical, or string values, or cell arrays of character vectors.
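For example, entries such as these (shown in MATLAB syntax; the values and names are illustrative) are valid hyperparameter values:

```matlab
0.01                              % numeric scalar
[0.01 0.1 1]                      % numeric vector
[true false]                      % logical vector
["sgdm" "adam" "rmsprop"]         % string vector
{'squeezenet','googlenet'}        % cell array of character vectors
```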

Experiment Manager trains the network using every combination of the hyperparameter values specified in the table.

Bayesian Optimization

With Statistics and Machine Learning Toolbox, find optimal training options by using Bayesian optimization. Set Strategy to Bayesian Optimization. When you run the experiment, Experiment Manager searches for the best combination of hyperparameters. Each trial in the experiment uses a new combination of hyperparameter values based on the results of the previous trials.

In the Hyperparameters table, specify the name, search range, type (real, integer, or categorical), and transform of each hyperparameter used in the experiment.

To specify the duration of your experiment, under Bayesian Optimization Options, enter the maximum time in seconds and the maximum number of trials to run. Note that the actual run time and number of trials in your experiment can exceed these settings because Experiment Manager checks these options only when a trial finishes executing.

Optionally, specify deterministic constraints, conditional constraints, and an acquisition function for the Bayesian optimization algorithm (since R2023a). To set these options, under Bayesian Optimization Options, click Advanced Options.

Setup Function Signatures

This table lists the supported signatures for the setup function for a built-in training experiment.

| Goal of Experiment | Setup Function Signature |
| --- | --- |
| Train a network for image classification and regression tasks using the images and responses specified by images and the training options defined by options. | function [images,layers,lossFcn,options] = Experiment_setup(params) ... end |
| Train a network using the images specified by images and responses specified by responses. | function [images,responses,layers,lossFcn,options] = Experiment_setup(params) ... end |
| Train a network for sequence or time-series classification and regression tasks (for example, an LSTM or GRU network) using the sequences and responses specified by sequences. | function [sequences,layers,lossFcn,options] = Experiment_setup(params) ... end |
| Train a network using the sequences specified by sequences and responses specified by responses. | function [sequences,responses,layers,lossFcn,options] = Experiment_setup(params) ... end |
| Train a network for feature classification or regression tasks (for example, a multilayer perceptron, or MLP, network) using the feature data and responses specified by features. | function [features,layers,lossFcn,options] = Experiment_setup(params) ... end |
| Train a network using the feature data specified by features and responses specified by responses. | function [features,responses,layers,lossFcn,options] = Experiment_setup(params) ... end |
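For concreteness, here is a minimal sketch of a setup function that matches the first signature above. It is not the template code that Experiment Manager generates; the digit data set loaded below ships with Deep Learning Toolbox, and the hyperparameter names InitialLearnRate and Momentum are assumptions that must match the Hyperparameters table of your experiment:

```matlab
function [images,layers,lossFcn,options] = Experiment_setup(params)
% Illustrative setup function for an image classification hyperparameter sweep.

    % Load the digit images and labels, and package them as a datastore
    % that returns {predictor, target} pairs.
    [XTrain,TTrain] = digitTrain4DArrayData;
    dsX = arrayDatastore(XTrain,IterationDimension=4);
    dsT = arrayDatastore(TTrain);
    images = combine(dsX,dsT);

    % Small convolutional classifier for 28-by-28 grayscale digit images.
    layers = [
        imageInputLayer([28 28 1])
        convolution2dLayer(3,16,Padding="same")
        batchNormalizationLayer
        reluLayer
        fullyConnectedLayer(10)
        softmaxLayer];

    lossFcn = "crossentropy";

    % Training options built from this trial's hyperparameter values.
    options = trainingOptions("sgdm", ...
        InitialLearnRate=params.InitialLearnRate, ...
        Momentum=params.Momentum, ...
        MaxEpochs=5, ...
        Verbose=false);
end
```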

Tips

Version History

Introduced in R2020a


R2024b: Improvements to experiment setup

Preconfigured experiment templates now include an initialization function that configures data or other experiment details before the trial runs begin, which reduces trial run time. These templates also include suggested hyperparameters.

R2024b: Set up signal classification experiment using transfer learning with preconfigured template

If you have Signal Processing Toolbox™, you can set up your built-in experiment for signal classification using transfer learning by selecting a preconfigured template.

R2024a: Set up signal processing experiments with preconfigured templates

If you have Signal Processing Toolbox, you can set up your built-in or custom training experiments for signal classification by selecting a preconfigured template.

R2023b: App available in MATLAB

You can now use Experiment Manager in MATLAB, with or without Deep Learning Toolbox. When you share your experiments with colleagues who do not have a Deep Learning Toolbox license, they can open your experiments and access your results. Running the built-in and custom training experiments described on this page still requires Deep Learning Toolbox.

R2023b: Delete multiple experiments and results

Use the Experiment Browser to delete multiple experiments or multiple results from a project in a single operation. Select the experiments or results you want to delete, then right-click and select Delete. Your selection must contain only experiments or only results. If you delete an experiment, Experiment Manager also deletes the results contained in the experiment.

R2023a: Visualizations for custom training experiments

Display visualizations for your custom training experiments directly in the Experiment Manager app. When the training is complete, the Review Results gallery in the toolstrip displays a button for each figure that you create in your training function. To display a figure in the Visualizations panel, click the corresponding button in the Custom Plot section of the gallery.

R2023a: Debug code before or after running experiment

Diagnose problems in your experiment directly from the Experiment Manager app.

For more information, see Debug Deep Learning Experiments.

R2023a: Ease-of-use enhancements

R2022b: Ease-of-use enhancements

R2022a: Experiments as batch jobs in a cluster

If you have Parallel Computing Toolbox and MATLAB Parallel Server, you can send your experiment as a batch job to a remote cluster. If you have only Parallel Computing Toolbox, you can use a local cluster profile to develop and test your experiments on your client machine instead of running them on a network cluster. For more information, see Offload Experiments as Batch Jobs to a Cluster.

R2022a: Ease-of-use enhancements

R2021b: Bayesian optimization in custom training experiments

If you have Statistics and Machine Learning Toolbox, you can use Bayesian optimization to determine the best combination of hyperparameters for a custom training experiment. Previously, custom training experiments supported only sweeping hyperparameters. For more information, see Use Bayesian Optimization in Custom Training Experiments.

R2021b: Experiments in MATLAB Online

Run Experiment Manager in your web browser by using MATLAB Online™. For parallel execution of experiments, you must have access to a Cloud Center cluster.

R2021b: Ease-of-use enhancements

R2021a: Custom training experiments

Create custom training experiments to support workflows that built-in training experiments do not cover, such as custom training loops.

R2021a: Ease-of-use enhancements

R2020b: Parallel execution

If you have Parallel Computing Toolbox, you can run multiple trials of an experiment at the same time by clicking Use Parallel and then Run. Experiment Manager starts the parallel pool and executes multiple simultaneous trials. For more information, see Run Experiments in Parallel.

See Also

Apps

Functions

Objects

Topics