Test Performance Using Classes - MATLAB & Simulink


This example shows how to create and run a class-based performance test and regression test for the fprintf function.

Write Performance Test

Consider the following unit (regression) test. You can run this test as a performance test by using runperf("fprintfTest") instead of runtests("fprintfTest").

classdef fprintfTest < matlab.unittest.TestCase
    properties
        file
        fid
    end

    methods(TestMethodSetup)
        function openFile(testCase)
            testCase.file = tempname;
            testCase.fid = fopen(testCase.file,'w');
            testCase.assertNotEqual(testCase.fid,-1,'IO Problem')

            testCase.addTeardown(@delete,testCase.file);
            testCase.addTeardown(@fclose,testCase.fid);
        end
    end

    methods(Test)
        function testPrintingToFile(testCase)
            textToWrite = repmat('abcdef',1,5000000);
            fprintf(testCase.fid,'%s',textToWrite);
            testCase.verifyEqual(fileread(testCase.file),textToWrite)
        end

        function testBytesToFile(testCase)
            textToWrite = repmat('tests_',1,5000000);
            nbytes = fprintf(testCase.fid,'%s',textToWrite);
            testCase.verifyEqual(nbytes,length(textToWrite))
        end
    end
end

The measured time does not include the time to open and close the file or to perform the assertion, because these activities take place inside a TestMethodSetup block and not inside a Test block. However, the measured time does include the time to perform the verifications. A best practice is to define a more precise measurement boundary that includes only the code whose performance you want to test.

Create a performance test in a file named fprintfTest.m in your current folder. This test is similar to the regression test, with two modifications: the class derives from matlab.perftest.TestCase instead of matlab.unittest.TestCase, and each test method calls the startMeasuring and stopMeasuring methods to define the boundary of the measured code.

classdef fprintfTest < matlab.perftest.TestCase
    properties
        file
        fid
    end

    methods(TestMethodSetup)
        function openFile(testCase)
            testCase.file = tempname;
            testCase.fid = fopen(testCase.file,'w');
            testCase.assertNotEqual(testCase.fid,-1,'IO Problem')

            testCase.addTeardown(@delete,testCase.file);
            testCase.addTeardown(@fclose,testCase.fid);
        end
    end

    methods(Test)
        function testPrintingToFile(testCase)
            textToWrite = repmat('abcdef',1,5000000);

            testCase.startMeasuring();
            fprintf(testCase.fid,'%s',textToWrite);
            testCase.stopMeasuring();

            testCase.verifyEqual(fileread(testCase.file),textToWrite)
        end

        function testBytesToFile(testCase)
            textToWrite = repmat('tests_',1,5000000);

            testCase.startMeasuring();
            nbytes = fprintf(testCase.fid,'%s',textToWrite);
            testCase.stopMeasuring();

            testCase.verifyEqual(nbytes,length(textToWrite))
        end
    end
end

The measured time for this performance test includes only the call to fprintf, and the testing framework still evaluates the qualifications.
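The startMeasuring and stopMeasuring methods work well when a single call to the measured code is long enough to time reliably. For operations that complete too quickly, the matlab.perftest.TestCase class also provides the keepMeasuring method, which loops the measured code automatically and averages the time. A minimal sketch of a hypothetical test method (not part of this example) that uses it:

function testSmallWrite(testCase)
    % The framework runs the loop body repeatedly and divides the
    % total measured time by the number of iterations.
    while testCase.keepMeasuring
        fprintf(testCase.fid,'%s','abcdef');
    end
end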

Run Performance Test

Run the performance test. Depending on your system, you might see warnings that the performance testing framework ran the test the maximum number of times but did not achieve a 0.05 relative margin of error within a 0.95 confidence level.

results = runperf("fprintfTest")

Running fprintfTest .......... .......... ... Done fprintfTest


results =

1×2 TimeResult array with properties:

Name
Valid
Samples
TestActivity

Totals: 2 Valid, 0 Invalid. 3.6789 seconds testing time.

The results variable is a 1-by-2 TimeResult array. Each element in the array corresponds to one of the tests defined in the test file.
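For example, to see which test each element corresponds to, you can list the element names (a small illustrative snippet, not part of the original example):

{results.Name}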

Display Test Results

Display the measurement results for the first test. Your results might vary.

results(1)

ans =

TimeResult with properties:

        Name: 'fprintfTest/testPrintingToFile'
       Valid: 1
     Samples: [4×7 table]
TestActivity: [9×12 table]

Totals: 1 Valid, 0 Invalid. 2.7009 seconds testing time.

As indicated by the size of the TestActivity property, the performance testing framework collected nine measurements. This number includes five measurements to warm up the code. The Samples property excludes warm-up measurements.
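One way to confirm the split between warm-up and sample measurements is to compare the heights of the two tables. A minimal sketch using the properties shown above:

% Rows in TestActivity but not in Samples are warm-up measurements
numWarmups = height(results(1).TestActivity) - height(results(1).Samples)

For this run, numWarmups is 5. The Objective column of the TestActivity table (shown later in this example) also distinguishes warm-up measurements from samples.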

Display the sample measurements for the first test.

results(1).Samples

ans =

4×7 table

             Name                 MeasuredTime         Timestamp             Host        Platform                 Version                             RunIdentifier            
______________________________    ____________    ____________________    ___________    ________    __________________________________    ____________________________________

fprintfTest/testPrintingToFile       0.04193      14-Oct-2022 14:25:02    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    b8029663-5fbd-44e8-a5a6-06564ab18df6
fprintfTest/testPrintingToFile       0.04148      14-Oct-2022 14:25:02    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    b8029663-5fbd-44e8-a5a6-06564ab18df6
fprintfTest/testPrintingToFile      0.041849      14-Oct-2022 14:25:03    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    b8029663-5fbd-44e8-a5a6-06564ab18df6
fprintfTest/testPrintingToFile      0.041969      14-Oct-2022 14:25:03    MY-HOSTNAME     win64      9.14.0.2081372 (R2023a) Prerelease    b8029663-5fbd-44e8-a5a6-06564ab18df6

Compute Statistics for Single Test Element

Display the mean measured time for the first test. To exclude data collected in the warm-up runs, use the values in the Samples property.

sampleTimes = results(1).Samples.MeasuredTime;
meanTest = mean(sampleTimes)

meanTest =

    0.0418
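To relate these samples to the default statistical objective (a 0.05 relative margin of error at a 0.95 confidence level), you can estimate the relative margin of error yourself. A minimal sketch of that computation; it assumes the Statistics and Machine Learning Toolbox is available for tinv, and it mirrors, but is not, the framework's internal calculation:

n = numel(sampleTimes);
tCrit = tinv(0.975,n-1);                          % two-sided 95% critical value
relMoE = tCrit*std(sampleTimes)/sqrt(n)/meanTest  % relative margin of error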

Compute Statistics for All Test Elements

To compare the different calls to fprintf, create a table of summary statistics from results. In this example, both test methods write the same amount of data to a file. Therefore, some of the difference between the statistical values is attributable to calling the fprintf function with an output argument.

T = sampleSummary(results)

T =

2×7 table

             Name                 SampleSize      Mean      StandardDeviation      Min       Median       Max   
______________________________    __________    ________    _________________    ________    _______    ________

fprintfTest/testPrintingToFile        4         0.041807       0.00022367         0.04148    0.04189    0.041969
fprintfTest/testBytesToFile           9         0.044071         0.003268        0.041672    0.04223    0.049611

Change Statistical Objectives and Rerun Tests

Change the statistical objectives defined by the runperf function by constructing and running a time experiment. Construct a time experiment with measurements that reach a sample mean with a 2% relative margin of error within a 98% confidence level. Collect 4 warm-up measurements and up to 16 sample measurements.

Create a test suite.

suite = testsuite("fprintfTest");

Construct a time experiment with the specified requirements, and run the tests. In this example, the performance testing framework is not able to meet the stricter statistical objectives with the specified number of maximum samples. Your results might vary.

import matlab.perftest.TimeExperiment
experiment = TimeExperiment.limitingSamplingError("NumWarmups",4, ...
    "MaxSamples",16,"RelativeMarginOfError",0.02,"ConfidenceLevel",0.98);
resultsTE = run(experiment,suite);

Running fprintfTest .......... .......... ........
Warning: Target Relative Margin of Error not met after running the MaxSamples for fprintfTest/testBytesToFile.

Done fprintfTest


Increase the maximum number of samples to 32 and rerun the time experiment.

experiment = TimeExperiment.limitingSamplingError("NumWarmups",4, ...
    "MaxSamples",32,"RelativeMarginOfError",0.02,"ConfidenceLevel",0.98);
resultsTE = run(experiment,suite);

Running fprintfTest .......... .......... .......... . Done fprintfTest


Compute the summary statistics for the test elements.

T1 = sampleSummary(resultsTE)

T1 =

2×7 table

             Name                 SampleSize      Mean      StandardDeviation      Min        Median       Max   
______________________________    __________    ________    _________________    ________    ________    ________

fprintfTest/testPrintingToFile         4        0.041632       4.2448e-05        0.041578    0.041638    0.041674
fprintfTest/testBytesToFile           19        0.042147        0.0016461        0.041428    0.041705    0.048784

Measure First-Time Cost

Start a new MATLAB® session. A new session ensures that MATLAB has not run the code contained in your tests.

Measure the first-time cost of your code by creating and running a fixed time experiment with zero warm-up measurements and one sample measurement.

Create a test suite. Because you are measuring the first-time cost of a function, run a single test. To run multiple tests, save the results and start a new MATLAB session between tests, as sketched at the end of this example.

suite = testsuite("fprintfTest/testPrintingToFile");

Construct and run the time experiment.

import matlab.perftest.TimeExperiment
experiment = TimeExperiment.withFixedSampleSize(1);
results = run(experiment,suite);

Running fprintfTest . Done fprintfTest


Display the results. The TestActivity table shows that there were no warm-up measurements.

fullTable = results.TestActivity

fullTable =

1×12 table

             Name                 Passed    Failed    Incomplete    MeasuredTime    Objective         Timestamp             Host        Platform                 Version                            TestResult                         RunIdentifier            
______________________________    ______    ______    __________    ____________    _________    ____________________    ___________    ________    __________________________________    ______________________________    ____________________________________

fprintfTest/testPrintingToFile    true      false       false         0.044004       sample      14-Oct-2022 14:32:51    MY-HOSTNAME     win64      9.14.0.2078117 (R2023a) Prerelease    1×1 matlab.unittest.TestResult    be5b0bfd-9b87-4498-9ef3-675c6889a85c
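As noted earlier, to measure the first-time cost of more than one test, save each result and start a new MATLAB session between runs. A minimal sketch of that workflow; the file name is hypothetical:

% In the session that ran the experiment
save("firstTimeCost_testPrintingToFile.mat","results")

% In a later session, after measuring other tests the same way
S = load("firstTimeCost_testPrintingToFile.mat");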

See Also

runperf | testsuite | matlab.perftest.TimeExperiment | matlab.perftest.TestCase | matlab.perftest.TimeResult
