When you develop and test software units using Model-Based Design, use the Model Testing Dashboard to assess the status and quality of your unit model testing activities. Requirements-based testing is a central element of model verification. By establishing traceability links between your requirements, model design elements, and test cases, you can measure the extent to which the requirements are implemented and verified. The Model Testing Dashboard analyzes this traceability information and provides detailed metric measurements on the traceability, status, and results of these testing artifacts.
Each metric in the dashboard measures a different aspect of the quality of your unit testing and reflects guidelines in industry-recognized software development standards, such as ISO 26262 and DO-178C. To monitor the requirements-based testing quality of your models in the Model Testing Dashboard, maintain your artifacts in a project and follow these considerations. For more information on using the Model Testing Dashboard, see Explore Status and Quality of Testing Activities Using the Model Testing Dashboard.
To analyze your requirements-based testing activities in the Model Testing Dashboard, store your design and testing artifacts in a project (see the sketch after this list). The artifacts that the testing metrics analyze include:
Requirements that you create in Simulink® Requirements™
Libraries that the models use
Test cases that you create in Simulink Test™
Test results from the executed test cases
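You typically manage these artifacts interactively, but you can also open the project and the dashboard from the MATLAB command line. This is a minimal sketch; the project path is a placeholder for your own project root:

    % Open the project that contains the models, requirements,
    % test cases, and test results (path is hypothetical).
    proj = openProject("C:\work\CruiseControlProject");

    % Open the Model Testing Dashboard for the current project.
    modelTestingDashboard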
When your project contains many models and model reference hierarchies, you can more easily track your unit testing activities by configuring the dashboard to recognize the different testing levels of your models. Specify which models are units and which models are higher-level components. The dashboard organizes your models in the Artifacts pane according to their testing levels and the model reference hierarchy. For more information, see Categorize Models in a Hierarchy as Components or Units.
As you modify your models and testing artifacts, update the dashboard to reflect the latest changes. To analyze the latest artifacts in the Model Testing Dashboard, check that you complete these steps, sketched programmatically after the list:
Save the changes to your artifact files.
Export test results and save them in a results file.
Store the files that you want to analyze in the project.
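At the MATLAB command line, these steps look roughly like this sketch. The model and file names are hypothetical, and the test run assumes that your test files are already open in the Test Manager:

    % Save pending changes to the unit model.
    save_system("myUnit");

    % Run the open test files; the returned results are temporary
    % until you export them.
    resultSet = sltest.testmanager.run;

    % Export the results to a results file for the dashboard to analyze.
    sltest.testmanager.exportResults(resultSet, "myUnitResults.mldatx");

    % Add the results file to the project so that it is analyzed.
    proj = currentProject;
    addFile(proj, "myUnitResults.mldatx");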
To determine which artifacts are in the scope of a unit, the Model Testing Dashboard analyzes the traceability links between the artifacts and the software unit models in the project. The Artifacts pane lists the unit models, represented by the model names, organized by the components that reference them. Under each unit, the pane shows the artifacts that trace to the unit, organized into the folders described below, such as Functional Requirements, Design, Test Cases, and Test Results.
To see the traceability path that the dashboard found from an artifact to its unit, right-click the artifact and click View trace to unit. A traceability graph opens in a new tab in the Model Testing Dashboard. The graph shows the connections and intermediate artifacts that the dashboard traced from the unit to the artifact. To see the type of traceability that connects two artifacts, place your cursor over the arrow that connects them. The traceability relationship is either one artifact containing the other or one artifact tracing to the other. For example, the trace view for the functional requirement CC003_05 shows that it is contained in the requirement Activating cruise control. The container requirement traces to Set Switch Detection, which traces to the unit.
After the list of models, the Untraced folder shows artifacts that the dashboard has not traced to models. If an artifact returns an error during traceability analysis, the pane includes the artifact in the Errors folder. Use the traceability information in these folders and under each unit to check whether the testing artifacts trace to the models that you expect. To see details about the warnings and errors that the dashboard finds during artifact analysis, at the bottom of the Model Testing Dashboard dialog, click Diagnostics.
As you edit and save the artifacts in your project, the dashboard tracks your changes and indicates if the traceability data in the Artifacts pane might be stale by enabling the Trace Artifacts button. To update the traceability data, click Trace Artifacts. If the button is not enabled, the dashboard has not detected changes that affect the traceability information.
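Depending on your release, you can also refresh the traceability data programmatically through the metrics API in Simulink Check. A minimal sketch, assuming the metric.Engine interface is available and the project is open:

    % Create a metric engine for the current project.
    metricEngine = metric.Engine();

    % Refresh the artifact traceability information; this is the
    % programmatic counterpart of clicking Trace Artifacts.
    updateArtifacts(metricEngine);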
The folder Functional Requirements shows requirements where the Type is set to Functional and that trace to the unit model directly or through a container requirement, a library subsystem, or a combination of the two. For more information about linking requirements, see Requirement Links (Simulink Requirements).
If a requirement does not trace to a unit, it appears in the Untraced folder. If a requirement does not appear in the Artifacts pane when you expect it to, see Requirement Missing from Artifacts Pane.
When you collect metric results for a unit, the dashboard analyzes a subset of the requirements that appear in the Functional Requirements folder. The metrics analyze only requirements where the Type is set to Functional and that are directly linked to the model with a link where the Type is set to Implements. A requirement that traces to the unit but does not have these settings appears in the Functional Requirements folder but does not contribute to the metric results for requirements. For troubleshooting metric results for requirements, see Fix a requirement that does not produce metric results.
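For illustration, this sketch uses the Simulink Requirements API to give a requirement the settings that the metrics expect. The requirement set name, block path, and SID are hypothetical:

    % Load a requirement set and locate a requirement (names are placeholders).
    reqSet = slreq.load("myUnit_reqs.slreqx");
    req = find(reqSet, "Type", "Requirement", "SID", 1);

    % Only requirements whose Type is Functional contribute to the metrics.
    req.Type = "Functional";

    % Link a design element in the unit model directly to the requirement.
    load_system("myUnit");
    link = slreq.createLink("myUnit/Controller", req);

    % Only links whose Type is Implements count toward the requirement metrics.
    link.Type = "Implement";
    save(reqSet);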
The folder Design shows:
The model file that contains the block diagram for the unit.
Models that the unit references.
Libraries that are partially or fully used by the model.
Data dictionaries that are linked to the model.
The folder Test Cases shows test cases that trace to the model. This includes test cases that run on the model and test cases that run on subsystems in the model by using test harnesses. Create these test cases in a test file by using Simulink Test.
If a test case does not trace to a unit, it appears in the Untraced folder. If a test case does not appear in the Artifacts pane when you expect it to, see Test Case Missing from Artifacts Pane.
When you collect metric results for a unit, the dashboard analyzes a subset of the test cases that appear in the Test Cases folder. The dashboard analyzes only test cases that run on the model. Subsystem test harnesses appear in the folder but do not contribute to the metrics because they do not test the whole model. For troubleshooting test cases in metric results, see Fix a test case that does not produce metric results.
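For example, this sketch creates a test case that runs on the whole unit model, so it can contribute to the metrics. The model and file names are hypothetical:

    % Create a test file with a default test suite (names are placeholders).
    tf = sltest.testmanager.TestFile("myUnitTests.mldatx");
    ts = getTestSuites(tf);

    % Create a baseline test case that runs on the unit model itself,
    % not on a subsystem test harness, so that it counts toward the metrics.
    tc = createTestCase(ts, "baseline", "Requirements-based test");
    setProperty(tc, "Model", "myUnit");

    saveToFile(tf);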
The folder Test Results shows these types of test results from test cases that test the model:
Saved test results — results that you have collected in the Test Manager and have exported to a results file.
Temporary test results — results that you have collected in the Test Manager but have not exported to a results file. When you export the results from the Test Manager, the dashboard analyzes the saved results instead of the temporary results. Additionally, the dashboard stops recognizing the temporary results when you close the project or close the result set in the Simulink Test Result Explorer. If you want to analyze the results in a subsequent test or project session, export the results to a results file, as in the sketch after this list.
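For example, this sketch exports a temporary result set that is open in the Test Manager so that the dashboard analyzes the saved results instead. The file name is hypothetical:

    % Get the result sets that are currently open in the Test Manager.
    resultSets = sltest.testmanager.getResultSets;

    % Export the most recent result set to a results file; the dashboard
    % then analyzes the saved results instead of the temporary results.
    sltest.testmanager.exportResults(resultSets(end), "myUnitResults.mldatx");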
If a test result does not trace to a unit, it appears in the Untraced folder. If a test result does not appear in the Artifacts pane when you expect it to, see Test Result Missing from Artifacts Pane.
When you collect metric results for a unit, the dashboard analyzes a subset of the test results that appear in the Test Results folder. For troubleshooting test results in dashboard metric results, see Fix a test result that does not produce metric results.
The folder Untraced shows artifacts that the dashboard has not traced to models. Use the Untraced folder to check whether artifacts are missing traceability to the units. When you add traceability to an artifact, update the information in the pane by clicking Trace Artifacts. The Model Testing Dashboard does not support traceability analysis for some artifacts and links. If an artifact is untraced when you expect it to trace to a unit, see the troubleshooting solutions in Untraced Artifacts.
The folder Errors shows artifacts that returned errors when the dashboard performed artifact analysis. These are some errors that artifacts might return during traceability analysis:
An artifact returns an error if it has unsaved changes when traceability analysis starts.
A test results file returns an error if it was saved in a previous version of Simulink.
A model returns an error if it is not on the search path.
Open these artifacts and fix the errors. Then, to analyze the traceability in the dashboard, click Trace Artifacts.
To see details about artifacts that cause warnings, errors, and information messages during analysis, at the bottom of the Model Testing Dashboard dialog, click Diagnostics. You can filter the diagnostic messages by type and clear the messages from the viewer.
The diagnostic messages show:
Modeling constructs that the dashboard does not support
Links that the dashboard does not trace
Test harnesses or test cases that the dashboard does not support
Test results missing coverage or simulation results
Artifacts that return errors when the dashboard loads them
Information about model callbacks that the dashboard deactivates
Files that have file shadowing or path traceability issues
Artifacts that are not on the path and are not considered during tracing
The Model Testing Dashboard collects metric results for each unit listed in the Artifacts pane. Each metric in the dashboard measures a different aspect of the quality of the testing of your model and reflects guidelines in industry-recognized software development standards, such as ISO 26262 and DO-178C. For more information about the available metrics and the results that they return, see Model Testing Metrics.
As you edit and save the artifacts in your project, the dashboard tracks your changes and indicates if the metric results in the dashboard might be stale. If your changes affect the traceability information in the Artifacts pane, click Trace Artifacts. After you update the traceability information, if the metric results might be affected by your artifact changes, the Stale Metrics icon appears at the top of the dashboard. Affected widgets appear highlighted in gray. To update the results, click Collect Results > Collect All Results.
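You can also collect the same metric results programmatically with the metrics API in Simulink Check, which is useful in continuous integration. A minimal sketch, assuming the project is open; the report location is a placeholder:

    % Create a metric engine for the current project and collect
    % results for all model testing metrics.
    metricEngine = metric.Engine();
    execute(metricEngine);

    % Generate an HTML report of the collected results.
    generateReport(metricEngine, "Type", "html-file", ...
        "Location", fullfile(pwd, "ModelTestingReport.html"));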
The dashboard does not indicate stale metric data for these changes:
After you run a test case and analyze the results in the dashboard, if you make changes to the test case, the dashboard indicates that test case metrics are stale but does not indicate that the results metrics are stale.
When you change a coverage filter file that your test results use, the coverage metrics in the dashboard do not indicate stale data or include the changes. After you save the changes to the filter file, re-run the tests and use the filter file for the new results.