runperf - Run set of tests for performance measurement - MATLAB

Run set of tests for performance measurement

Syntax

results = runperf
results = runperf(tests)
results = runperf(___,Name,Value)

Description

results = runperf runs all the tests in your current folder for performance measurement and returns an array of matlab.perftest.TimeResult objects. Each element in results corresponds to an element in the test suite.

The performance testing framework runs the tests using a variable number of measurements to reach a sample mean with a 0.05 (5%) relative margin of error within a 0.95 (95%) confidence level. It runs the tests 5 times to warm up the code, and then between 4 and 256 times to collect measurements that meet the statistical objectives. If the sample mean does not meet the 0.05 relative margin of error within a 0.95 confidence level after 256 test runs, then the framework stops running the test and displays a warning. In this case, the TimeResult object contains information for the 5 warm-up runs and 256 measurement runs.
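
To see roughly how close a result came to this objective, you can estimate the relative margin of error of the sample mean from the Samples property yourself. The following is a minimal sketch, not the framework's internal implementation: it assumes a results array returned by runperf and uses an approximate 95% critical value of 1.96 instead of the exact value the framework computes.

res = results(1);                         % one TimeResult object
t = res.Samples.MeasuredTime;             % measurement-run times (warm-up runs excluded)
n = numel(t);
relMarginOfError = 1.96*std(t)/sqrt(n)/mean(t);
meetsObjective = relMarginOfError <= 0.05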

The runperf function provides a simple way to run a collection of tests as a performance experiment.

results = runperf(tests) runs a specified set of tests.

results = runperf(___,Name,Value) runs a set of tests with additional options specified by one or more name-value arguments.

Examples


In your current folder, create a script-based test, onesTest.m, that uses three different methods to initialize a 3000-by-1000 matrix of ones.

rows = 3000; cols = 1000;

%% Ones Function
X = ones(rows,cols);

%% Loop Assignment Without Preallocation
for r = 1:rows
    for c = 1:cols
        X(r,c) = 1;
    end
end

%% Loop Assignment With Preallocation
X = zeros(rows,cols);
for r = 1:rows
    for c = 1:cols
        X(r,c) = 1;
    end
end

Run the script as a performance test. The returned results variable is a 1-by-3 TimeResult array. Each element in the array corresponds to one of the tests defined in onesTest.m.

results = runperf("onesTest")

Running onesTest .......... .......... ....... Done onesTest


results =

1×3 TimeResult array with properties:

Name
Valid
Samples
TestActivity

Totals: 3 Valid, 0 Invalid. 23.1678 seconds testing time.

Display the measurement results for the second test, which loops the assignment without preallocation.

results(2)

ans =

TimeResult with properties:

        Name: 'onesTest/LoopAssignmentWithoutPreallocation'
       Valid: 1
     Samples: [4×7 table]
TestActivity: [9×12 table]

Totals: 1 Valid, 0 Invalid. 22.8078 seconds testing time.

Display the complete table of test measurements by accessing the TestActivity property of the result. The performance testing framework ran five warm-up runs, followed by four measurement runs (indicated as sample in the Objective column). Your results might vary.

results(2).TestActivity

ans =

9×12 table

                   Name                        Passed    Failed    Incomplete    MeasuredTime    Objective         Timestamp             Host        Platform                 Version                            TestResult                         RunIdentifier            
___________________________________________    ______    ______    __________    ____________    _________    ____________________    ___________    ________    __________________________________    ______________________________    ____________________________________

onesTest/LoopAssignmentWithoutPreallocation    true      false       false          2.5463        warmup      14-Oct-2022 13:51:36    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    1×1 matlab.unittest.TestResult    47ea2cab-5c34-4393-ba91-9715fb919d9b
onesTest/LoopAssignmentWithoutPreallocation    true      false       false          2.5294        warmup      14-Oct-2022 13:51:38    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    1×1 matlab.unittest.TestResult    47ea2cab-5c34-4393-ba91-9715fb919d9b
onesTest/LoopAssignmentWithoutPreallocation    true      false       false          2.4956        warmup      14-Oct-2022 13:51:41    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    1×1 matlab.unittest.TestResult    47ea2cab-5c34-4393-ba91-9715fb919d9b
onesTest/LoopAssignmentWithoutPreallocation    true      false       false          2.5369        warmup      14-Oct-2022 13:51:43    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    1×1 matlab.unittest.TestResult    47ea2cab-5c34-4393-ba91-9715fb919d9b
onesTest/LoopAssignmentWithoutPreallocation    true      false       false           2.535        warmup      14-Oct-2022 13:51:46    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    1×1 matlab.unittest.TestResult    47ea2cab-5c34-4393-ba91-9715fb919d9b
onesTest/LoopAssignmentWithoutPreallocation    true      false       false          2.5856        sample      14-Oct-2022 13:51:49    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    1×1 matlab.unittest.TestResult    47ea2cab-5c34-4393-ba91-9715fb919d9b
onesTest/LoopAssignmentWithoutPreallocation    true      false       false          2.5344        sample      14-Oct-2022 13:51:51    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    1×1 matlab.unittest.TestResult    47ea2cab-5c34-4393-ba91-9715fb919d9b
onesTest/LoopAssignmentWithoutPreallocation    true      false       false           2.542        sample      14-Oct-2022 13:51:54    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    1×1 matlab.unittest.TestResult    47ea2cab-5c34-4393-ba91-9715fb919d9b
onesTest/LoopAssignmentWithoutPreallocation    true      false       false          2.4653        sample      14-Oct-2022 13:51:56    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    1×1 matlab.unittest.TestResult    47ea2cab-5c34-4393-ba91-9715fb919d9b

Display the mean measured time for the second test. To exclude data collected in the warm-up runs, use the values in the Samples property.

mean(results(2).Samples.MeasuredTime)

To compare the different initialization methods in the script, create a table of summary statistics from results. In this example, the ones function was the fastest way to initialize the matrix to ones. The performance testing framework made four measurement runs for this test.

T = sampleSummary(results)

T =

3×7 table

                   Name                        SampleSize      Mean       StandardDeviation      Min        Median         Max   
___________________________________________    __________    _________    _________________    ________    _________    _________

onesTest/OnesFunction                              4         0.0052392       8.9302e-05        0.005171    0.0052078    0.0053703
onesTest/LoopAssignmentWithoutPreallocation        4            2.5318         0.049764          2.4653       2.5382       2.5856
onesTest/LoopAssignmentWithPreallocation           4          0.023947       0.00046027        0.023532     0.023921     0.024415
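
For instance, to order the approaches from fastest to slowest by their mean measured time, you can sort the summary table by its Mean column:

sortrows(T,"Mean")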

Compare the performance of various preallocation approaches by creating a test class that derives from matlab.perftest.TestCase.

In a file named preallocationTest.m in your current folder, create the preallocationTest test class. The class contains four Test methods that correspond to different approaches to creating a vector of ones. When you run any of these methods with the runperf function, the function measures the time it takes to run the code inside the method.

classdef preallocationTest < matlab.perftest.TestCase
    methods (Test)
        function testOnes(testCase)
            x = ones(1,1e7);
        end

        function testIndexingWithVariable(testCase)
            id = 1:1e7;
            x(id) = 1;
        end

        function testIndexingOnLHS(testCase)
            x(1:1e7) = 1;
        end

        function testForLoop(testCase)
            for i = 1:1e7
                x(i) = 1;
            end
        end
    end
end

Run performance tests for all the tests with "Indexing" in their name. Your results might vary, and you might see a warning if runperf does not meet statistical objectives.

results = runperf("preallocationTest","Name","Indexing")

Running preallocationTest .......... .......... .......... .. Done preallocationTest


results = 1×2 TimeResult array with properties:

Name
Valid
Samples
TestActivity

Totals: 2 Valid, 0 Invalid. 3.011 seconds testing time.

To compare the preallocation methods, create a table of summary statistics from results. In this example, the testIndexingOnLHS method was the faster way to initialize the vector to ones.

T = sampleSummary(results)

T =

  2×7 table

                   Name                        SampleSize      Mean      StandardDeviation      Min         Median        Max   
__________________________________________    __________    ________    _________________    ________    ________    ________

preallocationTest/testIndexingWithVariable        17          0.1223         0.014378         0.10003     0.12055     0.15075
preallocationTest/testIndexingOnLHS                5        0.027557        0.0013247        0.026187    0.027489    0.029403

Input Arguments


tests — Suite of tests, specified as a string array, character vector, or cell array of character vectors. Use this argument to specify your test content. For example, you can specify a test file, a test class, a folder that contains test files, a namespace that contains test classes, or a project folder that contains test files.

Example: runperf("myTestFile.m")

Example: runperf(["myTestFile/test1" "myTestFile/test3"])

Example: runperf("myNamespace.MyTestClass")

Example: runperf(pwd)

Example: runperf({'myNamespace.MyTestClass','myTestFile.m',pwd,'myNamespace.innerNamespace'})

Example: runperf("C:\projects\project1")

Name-Value Arguments


Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Example: runperf(tests,Name="productA_*") runs test elements with a name that starts with "productA_".

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Example: runperf(tests,"Name","productA_*") runs test elements with a name that starts with "productA_".

Test Identification


IncludeSubfolders — Option to run tests in subfolders, specified as a numeric or logical 0 (false) or 1 (true). By default, the framework runs tests in the specified folders, but not in their subfolders.
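
Example: runperf(pwd,IncludeSubfolders=true)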

IncludeInnerNamespaces — Option to run tests in inner namespaces, specified as a numeric or logical 0 (false) or 1 (true). By default, the framework runs tests in the specified namespaces, but not in their inner namespaces.
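
Example: runperf("myNamespace",IncludeInnerNamespaces=true)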

IncludeReferencedProjects — Option to include tests from referenced projects, specified as a numeric or logical 0 (false) or 1 (true). For more information on referenced projects, see Componentize Large Projects.
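
Example: runperf("C:\projects\project1",IncludeReferencedProjects=true)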

InvalidFileFoundAction — Action to take against an invalid test file in a folder or namespace that the function is processing, specified as one of these values:

"warn" (default) — Issue a warning for each invalid test file and run the tests in the valid files.
"error" — Throw an error when an invalid test file is found.

An invalid test file is a test file that the framework cannot run. Examples include a test file that contains syntax errors, a function-based test file that is missing local functions, and a file with a Test method that is passed an undefined parameterization property.
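
Example: runperf(pwd,InvalidFileFoundAction="error")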

Test Filtering


BaseFolder — Name of the base folder that contains the test file, specified as a string array, character vector, or cell array of character vectors. This argument filters the test suite. For the testing framework to include a test in the filtered suite, the Test element must be contained in one of the base folders specified by BaseFolder. If none of the Test elements match a base folder, an empty test suite is returned. Use the wildcard character (*) to match any number of characters. Use the question mark character (?) to match a single character.

For test files defined in namespaces, the base folder is the parent of the top-level namespace folder.
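
Example: runperf(tests,BaseFolder="C:\projects\*")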

DependsOn — Names of the files and folders that contain source code, specified as a string vector, character vector, or cell vector of character vectors. This argument filters the test suite by including only the tests that depend on the specified source code. If none of the tests depend on the source code, an empty test suite is returned.

The specified value must represent at least one existing file. If you specify a folder, the framework extracts the paths to the files within the folder.

You must have a MATLAB® Test™ license to use DependsOn. For more information about selecting tests by source code dependency, see matlabtest.selectors.DependsOn (MATLAB Test).

Example: DependsOn=["myFile.m" "myFolder"]

Example: DependsOn=["folderA" "C:\work\folderB"]

Name — Name of the test, specified as a string array, character vector, or cell array of character vectors. This argument filters the test suite. For the testing framework to include a test in the filtered suite, the Name property of the Test element must match one of the names specified by Name. If none of the Test elements have a matching name, an empty test suite is returned. Use the wildcard character (*) to match any number of characters. Use the question mark character (?) to match a single character.

For a given test file, the name of a test uniquely identifies the smallest runnable portion of the test content. The test name includes the namespace name, filename (excluding the extension), procedure name, and information about parameterization.
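
Example: runperf("preallocationTest",Name="*Indexing*")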

ParameterProperty — Name of a test class property that defines a parameter used by the test, specified as a string array, character vector, or cell array of character vectors. This argument filters the test suite. For the testing framework to include a test in the filtered suite, the Parameterization property of the Test element must contain at least one of the property names specified by ParameterProperty. If none of the Test elements have a matching property name, an empty test suite is returned. Use the wildcard character (*) to match any number of characters. Use the question mark character (?) to match a single character.
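
Example: runperf(tests,ParameterProperty="*Size*")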

ParameterName — Name of a parameter used by the test, specified as a string array, character vector, or cell array of character vectors. MATLAB generates parameter names based on the test class property that defines the parameters.

The ParameterName argument filters the test suite. For the testing framework to include a test in the filtered suite, the Parameterization property of the Test element must contain at least one of the parameter names specified by ParameterName. If none of the Test elements have a matching parameter name, an empty test suite is returned. Use the wildcard character (*) to match any number of characters. Use the question mark character (?) to match a single character.
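
Example: runperf(tests,ParameterName="large*")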

Superclass — Name of the class that the test class derives from, specified as a string array, character vector, or cell array of character vectors. This argument filters the test suite. For the testing framework to include a test in the filtered suite, the TestClass property of the Test element must point to a test class that derives from one of the classes specified by Superclass. If none of the Test elements match a class, an empty test suite is returned.
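
Example: runperf(tests,Superclass="matlab.perftest.TestCase")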

Tag — Name of a tag used by the test, specified as a string array, character vector, or cell array of character vectors. This argument filters the test suite. For the testing framework to include a test in the filtered suite, the Tags property of the Test element must contain at least one of the tag names specified by Tag. If none of the Test elements have a matching tag name, an empty test suite is returned. Use the wildcard character (*) to match any number of characters. Use the question mark character (?) to match a single character.
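
Example: runperf(tests,Tag="featureA*")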

Tips

To customize the statistical objectives or the number of warm-up and measurement runs, create an explicit test suite and run it with a matlab.perftest.TimeExperiment object instead of runperf.

Version History

Introduced in R2016a


When you select function-based or class-based tests using theDependsOn name-value argument (requires MATLAB Test), the function more accurately selects tests that depend on the specified source code. If the function can determine which individual tests in the test file depend on the source code, then it selects only the dependent tests and excludes the rest. Otherwise, the function includes all the tests in the test file.

In previous releases, the function includes all the tests in a test file if the file depends on the specified source code, without attempting to exclude tests that are not dependent on the source code.

The IncludeSubpackages name-value argument is now named IncludeInnerNamespaces. The behavior remains the same, and existing instances of IncludeSubpackages in your code continue to work as expected. There are no plans to remove support for existing references to IncludeSubpackages.

If you have a MATLAB Test license, you can specify any type of source file using the DependsOn name-value argument. In previous releases, you can specify files only with a .m, .p, .mlx, .mlapp, .mat, or .slx extension.

You can filter a test suite by test file dependency on specified source code. Use the DependsOn name-value argument (requires MATLAB Test) to specify the source files and folders.

If you have Requirements Toolbox™ and MATLAB Test installed, you can use the runperf function to run tests that verify requirement sets. To run tests, specify one or more requirement set files as a string scalar or string vector. For example,results = runperf("myRequirementSet.slreqx") runs the tests that verify the specified requirement set.

The number of times that runperf exercises the test code to warm it up has increased from four to five. This change results in typically fewer samples required to meet the objective relative margin of error.

If your code relies on the previous value, you might need to update your code. For example, if you use warmupTable = results(1).TestActivity(1:4,:) to create a table of warm-up measurements, replace 4 with 5.

To specify whether the framework issues a warning or throws an error when it encounters an invalid test file in a folder or namespace, use the InvalidFileFoundAction name-value argument.

When you assign a nonempty cell array to a parameterization property, the testing framework generates parameter names from the elements of the cell array by taking into account their values, types, and dimensions. In previous releases, if the property value is a cell array of character vectors, the framework generates parameter names from the values in the cell array. Otherwise, the framework specifies parameter names as value1, value2, …, valueN.

If your code uses parameter names to create or filter test suites, replace the old parameter names with the descriptive parameter names. For example, update suite = testsuite(pwd,"ParameterName","value1") by replacing value1 with a descriptive parameter name.

The IncludeSubfolders name-value argument treats folders and namespaces the same way. For example,runperf(pwd,IncludeSubfolders=true) runs all the tests in the current folder and any of its subfolders, including namespace folders. In previous releases, IncludeSubfolders ignores namespace folders.

The runperf function ignores any files in a MATLAB project that do not define test procedures. For example, if an abstract TestCase class definition file is labeled with the Test classification, the function ignores it. In previous releases, MATLAB produces an error if runperf is called on a project that uses the Test classification for any files other than concrete test files.

If MATLAB runs without the Java® Virtual Machine (JVM®) software, runperf cannot run the tests in a MATLAB project. The reason is that the project cannot be opened without the JVM software. In previous releases, when MATLAB runs without the JVM software, runperf creates a suite from the test files in the project and runs the suite.

When your current folder is a project root folder or when you pass the path to a project root folder to the runperf function, the function runs all test files contained in the specified project that are labeled with the Test classification.

To run the tests from referenced projects, use the IncludeReferencedProjects name-value argument.

The runperf function returns a matlab.perftest.TimeResult array containing the results of the specified performance tests. In previous releases, the function returns an array of matlab.unittest.measurement.MeasurementResult objects.

The default maximum number of sample measurements thatrunperf makes when running performance measurements has increased from 32 to 256.