You can use model metrics to assess whether your model and code comply with size, complexity, and readability requirements. As you test your model against requirements, you can use metrics to assess the status and quality of your requirements-based testing activities. You can use the model metric API to create your own model metrics, compute model and testing metrics, and export metric data. To visualize model metric data and compliance status for your model, use the Metrics Dashboard. To view metric data on the completeness of requirements, test cases, and test results for your model, use the Model Testing Dashboard. To get started, see Collect and Explore Metric Data by Using the Metrics Dashboard and Explore Status and Quality of Testing Activities Using the Model Testing Dashboard.
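For example, assuming the metricsdashboard and modelTestingDashboard functions are available in your release of Simulink Check, you can open both dashboards from the MATLAB command line; the model name here is a placeholder:

    metricsdashboard('sldemo_fuelsys')   % Metrics Dashboard for an example model
    modelTestingDashboard                % Model Testing Dashboard for the open project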
Collect and Explore Metric Data by Using the Metrics Dashboard
Collect and view metric data for quality assessment.
Customize Metrics Dashboard Layout and Functionality
Configure compliance metrics, add metric thresholds, and customize Metrics Dashboard layout.
Collect Model Metrics Programmatically
Use the model metric API to programmatically collect metrics for a model, such as subsystem and block counts.
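The following sketch shows one way to do this, assuming the slmetric API and the listed metric identifiers are available in your release; the sldemo_fuelsys example model stands in for your own model:

    model = 'sldemo_fuelsys';                 % example model; substitute your own
    load_system(model);

    % Create a metric engine and point it at the model
    metricEngine = slmetric.Engine();
    setAnalysisRoot(metricEngine, 'Root', model, 'RootType', 'Model');

    % Collect the block count and subsystem count metrics
    metricIDs = {'mathworks.metrics.SimulinkBlockCount', ...
                 'mathworks.metrics.SubSystemCount'};
    execute(metricEngine, metricIDs);

    % Retrieve and display the results for each component
    collections = getMetrics(metricEngine, metricIDs);
    for n = 1:numel(collections)
        for m = 1:numel(collections(n).Results)
            r = collections(n).Results(m);
            fprintf('%s | %s | value: %g\n', r.MetricID, r.ComponentPath, r.Value);
        end
    end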
Model metrics provided by MathWorks that return metric data on your model for size, complexity, readability, and compliance to standards and guidelines.
Options for defining model metric data aggregation and returning aggregated model metric results.
Create a Custom Model Metric for Nonvirtual Block Count
Create a new model metric by using the slmetric.metric.createNewMetricClass function and defining the metric algorithm.
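As a condensed sketch of those steps, with placeholder names and the metric algorithm itself left to the generated class file:

    % Create a class file for the new metric in the current folder
    className = 'nonvirtualblockcount';       % illustrative class and metric name
    slmetric.metric.createNewMetricClass(className);

    % Edit nonvirtualblockcount.m to implement the metric algorithm, then
    % register the metric so the metric engine can collect it
    slmetric.metric.registerMetric(className);

    % Collect the custom metric for a model
    metricEngine = slmetric.Engine();
    setAnalysisRoot(metricEngine, 'Root', 'sldemo_fuelsys', 'RootType', 'Model');
    execute(metricEngine, className);
    results = getMetrics(metricEngine, className);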
Identify Modeling Clones with the Metrics Dashboard
You can use the Metrics Dashboard to enable subsystem reuse by identifying exact graphical clones across a model hierarchy.
Collect Model Metrics Using the Model Advisor
Assess model size, complexity, and readability by using the Model Advisor to run model metric checks.
Collect Compliance Data and Explore Results in the Model Advisor
This example shows how to collect model metric data by using the Metrics Dashboard, explore detailed compliance results, and fix compliance issues by using the Model Advisor.
Collect Metric Data Programmatically and View Data Through the Metrics Dashboard
This example shows how to use the model metrics API to collect model metric data for your model, and then explore the results by using the Metrics Dashboard.
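A minimal sketch of that workflow, assuming the model belongs to the open project and that the metricsdashboard function is available in your release:

    model = 'sldemo_fuelsys';                 % example model; substitute a project model
    load_system(model);

    % Collect all available model metrics programmatically
    metricEngine = slmetric.Engine();
    setAnalysisRoot(metricEngine, 'Root', model, 'RootType', 'Model');
    execute(metricEngine);

    % Open the Metrics Dashboard to explore the collected results
    metricsdashboard(model)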
Fix Metric Threshold Violations in a Continuous Integration Systems Workflow
Use a continuous integration workflow to investigate whether your model violates metric threshold values.
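One way to script such a check in a CI job is sketched below. The cyclomatic complexity identifier is a MathWorks model metric; the threshold value and example model are assumptions for illustration, and a real setup would typically read thresholds from a metric configuration:

    model = 'sldemo_fuelsys';                 % example model; substitute your own
    load_system(model);

    metricEngine = slmetric.Engine();
    setAnalysisRoot(metricEngine, 'Root', model, 'RootType', 'Model');
    execute(metricEngine, 'mathworks.metrics.CyclomaticComplexity');

    % Fail the job (nonzero exit in batch mode) if any component exceeds the limit
    threshold = 30;                           % assumed limit for illustration
    collection = getMetrics(metricEngine, 'mathworks.metrics.CyclomaticComplexity');
    for n = 1:numel(collection)
        for m = 1:numel(collection(n).Results)
            r = collection(n).Results(m);
            if r.Value > threshold
                error('Threshold violation: %s has complexity %g (limit %d)', ...
                      r.ComponentPath, r.Value, threshold);
            end
        end
    end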
Compare Model Complexity and Code Complexity Metrics
Assess the complexity of your system in model-based design.
Explore Status and Quality of Testing Activities Using the Model Testing Dashboard
Evaluate the status and quality of model testing in your project.
Fix Requirements-Based Testing Issues
Fix model testing quality issues by using the Model Testing Dashboard.
Model testing metrics provided by MathWorks that return metric data on the implementation and testing of your requirements.
Manage Requirements-Based Testing Artifacts for Analysis in the Model Testing Dashboard
Set up and manage a project that uses the Model Testing Dashboard.
Collect Metrics on Model Testing Artifacts Programmatically
Use a script to assess the quality of your requirements-based testing.
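A sketch of such a script, assuming a project that contains the testing artifacts; the project name and metric identifier are placeholders, and you can call getAvailableMetricIds on the engine to list the identifiers available in your release:

    % Open the project that contains the model, requirements, and tests
    openProject('MyTestingProject');          % hypothetical project name

    % Create a metric engine for the project and collect the testing metrics
    metricEngine = metric.Engine();
    execute(metricEngine);

    % Retrieve results for one metric (identifier shown is an example)
    results = getMetrics(metricEngine, 'TestCasesPerRequirementDistribution');

    % Generate an HTML report of the collected model testing metrics
    generateReport(metricEngine, 'Type', 'html-file', ...
        'Location', fullfile(pwd, 'ModelTestingReport.html'));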
Assess the Completeness of Requirements-Based Testing in Accordance with ISO 26262
Use the Model Testing Dashboard to analyze the completeness and quality of requirements-based testing activities in accordance with the ISO 26262 standard.