Edit Test Plan Definition

Work With Test Plans

If you click Fit models or Design experiment in the Common Tasks pane, the dialog box guides you through the steps for setting up inputs and models. Follow the steps in Fit Models to Data or Set Up Design Inputs. If you follow those steps, then you do not need to set up anything using the test plan diagram. However, you can later edit settings from the test plan.

  • After you fit models, the view at the test plan node displays the Response Models tab. View the cross-section plots of all your response models. See Assess High-Level Model Trends.

  • If you want to edit the test plan settings, click the Test Plan tab to switch back to the test plan diagram.

After you create a test plan, you can use the test plan diagram view to:

  • Edit model inputs.

  • Edit local, global, and response model types.

  • Add new response models.

  • View and edit designs, and create designs at the local level.

  • View and refine boundary models.

  • Choose summary statistics.

  • Select new data for modeling.

When you select a test plan node in the model tree (and click the Test Plan tab, if you have already fit models), this view appears.

Model Selection window showing four plots of response curves versus input factors with confidence intervals and vertical orange lines marking selected points. Left panel lists input factors N, L, ICP, and ECP with tolerance values and options for confidence level and boundary constraints. Bottom panel displays model list with RBF-multiquadric models and their statistics.

This example is a two-stage model. All test plan nodes show this view with a block diagram of the test plan. The diagram provides a graphical interface so you can set up inputs and models by double-clicking the blocks in the test plan diagram. You can also use the Test Plan menu.

Use the diagram to edit the test plan settings. Select a model block to choose the stage of the model hierarchy to use with the following menu choices:

  • Set Up Model

  • Design Experiment

  • View Design Data

  • View Model

  • Summary Statistics

The selected Model block is highlighted in yellow while its Setup dialog box is open; otherwise the selection is indicated by blocks at its corners.

The following sections describe how to set up models, designs and data from your test plan.

Edit Model Inputs

Edit the number and definition of model input factors for each stage by double-clicking or right-clicking the inports of the test plan block diagram. You can update ranges and symbols and refit existing models. Setting ranges can be important before you design experiments.

The following example shows the input setup dialog box for the global model. The dialog box for the local model contains exactly the same controls.

Dialog box titled ‘Global Input Factor Setup’ showing settings for one factor with symbol G1, minimum 0, maximum 100, transform set to None, and an empty signal field.

You can use the following controls:

  • Number of Factors

    You can change the number of input factors using the buttons at the top.

  • Symbol

    The input symbol is used as a shortened version of the signal name throughout the application. The symbol should contain a maximum of three characters.

  • Min and Max Model Range

    This setting is important before you design experiments. The default range is [0, 100]. There is usually some knowledge about realistic ranges for variables. If you are not designing an experiment, you can use the data range as the model range later, in the data selection stage. In some cases you might not want to use the data range (for example, if the data covers too wide a range, or not wide enough) because you are interested in modeling a particular region. In that case you can set the range of interest here.

  • Transform

    You can use input transformations to change the input factor for designing experiments. The available input transformations are 1/x, sqrt(x), log10(x), x^2, log(x).

  • Signal

    You can set up the signal name in the input factor setup dialog box. It is not necessary to set this parameter at this stage, as it can be defined later at the data selection stage (as with the range). However, setting the signal name in this dialog box simplifies the data selection procedures, as the Model Browser looks for matching signal names in loaded data sets. When the number of data set variables is large this can save time.
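The same input-factor settings can also be specified programmatically. The following sketch assumes the toolbox's command-line `mbcmodel.modelinput` class and its `Symbol` and `Range` properties; the factor names and ranges shown are illustrative, so check the command-line API reference before relying on the exact property names.

```matlab
% Sketch: defining input factors at the command line, assuming the
% mbcmodel.modelinput class from the toolbox command-line API.
% Symbols and ranges below are illustrative values, not from this page.
inputs = mbcmodel.modelinput( ...
    'Symbol', {'N', 'L'}, ...           % short symbols (maximum three characters)
    'Range',  {[500 6000], [0.1 1]});   % min/max model range for each factor
```

Setting realistic ranges here serves the same purpose as the Min and Max fields in the dialog box: the ranges are used when you design experiments.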

Edit Local, Global, and Response Models

Set up models by double-clicking the model blocks in the test plan diagram. Select model types to set up the new default models for each stage in the model hierarchy.

The block diagram in the test plan view represents the hierarchical structure of models. Following is an example of a two-stage test plan block diagram.

Diagram showing a modeling workflow with local and global inputs feeding into quadratic models. Local input labeled ‘1’ connects to Local Model, which outputs to Responses labeled ‘1’. Global input labeled ‘2’ connects to Global Model, which also links to Local Model.

See Explore Local Model Types and Explore Global Model Types for information on all model options.

After you set up model types, you can design an experiment, or select data for fitting.

To choose data for fitting, double-click the Responses block in the test plan diagram to open the Data Wizard. Alternatively, click the Select Data toolbar button or select TestPlan > Select Data.

When you first set up a test plan, the Data Wizard guides you through response model setup after the data matching functions.

To add a new response model to an existing test plan, double-click the Responses outport (or click the New button at test plan level). See Add Response Models and Datum Models.

Design Experiments

Note

If you use the Design Experiment common task workflow, you create designs for global inputs only. If you want to create designs at the local level, for example for point-by-point modeling, then you must open the Design Editor from the local block in the test plan diagram.

You can access the Design Editor from the test plan via the context menus on the model blocks, or via the TestPlan menu for a particular model (you must select a model or input block before you can design an experiment). View Design Data also opens the Design Editor, where you can investigate the statistical design properties of the data.

When the test plan already has a design, the design name is displayed.

You can design experiments for both stages, local and global. You open the Design Editor in several ways from the test plan level:

  • Right-click a Model block in the test plan diagram and select Design Experiment.

    First click a model block to choose the stage to design for (first or second stage); selecting a block enables the following two options:

  • Click the Design Experiment toolbar button.

  • Select TestPlan > Design Experiment.

For an existing design, View > Design Data (also available in the context menu on each Model block) launches the Design Editor as well. In this case you can view only the current data being used as a design at this stage. If you enter the Design Editor by the other routes, you can view all alternative designs for that stage.

See Design of Experiments.

Viewing Designs

The view design facility enables you to investigate the statistical properties of the current data.

From the test plan node, select the model stage you are interested in by clicking, then choose View > Design Data. Alternatively, use the context menu on a Model block.

This provides access to all the Design Editor and design evaluation utility functions with the current data rather than the pre-specified design. If you have done some data-matching to a design, each data point is used as a design point. You can now investigate the statistical properties of this design.

For two-stage models, viewing stage one (local model) designs creates a separate design for each test.

See Design Experiments or the step-by-step guide in Design of Experiments in the Getting Started documentation.

Select New Data

To load new data, select TestPlan > Fit Models. See Import and Merge Data in MBC Model Fitting App.

To attach data to the test plan, double-click the Responses block in the test plan diagram to open the Data Wizard (if the project already has data loaded). Alternatively, use TestPlan > Select Data or the toolbar button. If no data is selected, this button opens the Data Wizard; if a data set is already selected for the test plan, it takes you straight to the Data Selection views in the Data Editor.

In the Data Editor you can select data for modeling and match data to a design. For example, after the design changes, new data matching might be necessary. See Match Data to Designs for details.

If a test plan already has data attached to it, details of the data set (such as name, number of records) are displayed in the right pane.

You can attach validation data to your test plan using the TestPlan menu. You can use validation data with any model except response features. When you attach validation data to your test plan, Validation RMSE is automatically added to the summary statistics for comparison in the bottom list view of response models in the test plan. See Using Validation Data.

If the test plan already has validation data attached to it, the name is displayed in the right pane.

Choose Summary Statistics

Right-click the global model block in the test plan diagram and select Summary Statistics to reach the Summary Statistics dialog box. In this dialog box you can choose which summary statistics you want displayed to help you evaluate models. See Summary Statistics.

View and Refine Boundary Models

From the test plan you can access the Boundary Constraint Modeling functionality from the toolbar or TestPlan menu. See Explore Boundary Model Types.

When the test plan already has a boundary model, the right pane displays which boundary models are combined in the best boundary model.

Save the Current Test Plan as a Template

You can save the current test plan as a template using the TestPlan > Make Template command or the toolbar button. This capability can be useful for speeding up creation of subsequent projects. See Create and Reuse Test Plan Templates.

Automate Model Fits with MATLAB Function

You can generate a MATLAB® function that creates a new test plan from an existing test plan. The test plan contains the same data pre-processing rules, model types, and boundary models that are in the original test plan. Use the function to fit a model with new data.

To generate the function:

  1. Select the model node.

  2. Select TestPlan > Generate Code.

  3. Name and save the function.

For example, to create a function that fits the gasolineOneStageModels models with new data, follow these steps:

  1. In the Model Browser, select File > Open Project. Navigate to <matlabroot>/toolbox/mbc/mbctraining. Open the gasolineOneStage.mat project.

  2. Select the gasolineOneStageModels model node. Select TestPlan > Generate Code.

  3. Navigate to your working folder. Save the MATLAB function as gasolineOneStageModels.m.

    The MATLAB editor opens. The gasolineOneStageModels.m function creates a test plan with the same data pre-processing rules, model types, and boundary models that are in the original test plan.

    function T = gasolineOneStageModels(Project,Data)
    %gasolineOneStageModels MBC test plan function
    %    T = gasolineOneStageModels(Project,Data);
    %    Requires test plan template gasolineOneStageModels.mbt.
    %    Data can be a file name or a table object.
    %
    %    Auto-generated from gasolineOneStage/gasolineOneStageModels in Model-Based Calibration toolbox version 5.5(R2019a).
    
    narginchk(2,2)
    assert(isa(Project,'mbcmodel.project'),'An mbcmodel.project object is required.')
    
    %Import data into MBC project
    D = CreateData(Project,Data);
    BeginEdit(D);
    %Variables
    %Filters
    AddFilter(D,' KIT1<2');
    AddFilter(D,' RF1<25');
    AddFilter(D,' TSPEED<200000');
    AddFilter(D,' TEXH<860');
    AddFilter(D,' SIMTIME<249');
    AddFilter(D,' LOAD<2');
    AddFilter(D,' SA>1 & SA<50');
    CommitEdit(D);
    
    %Create test plan and attach data
    T = CreateTestplan(Project,'gasolineOneStageModels.mbt');
    AttachData(T,D,'UseDataRange',true,'Boundary',false);
    %Create boundary models
    mdl = CreateBoundary(T.Boundary,'Convex hull');
    Add(T.Boundary,mdl);

  4. Create a new project that fits the gasolineOneStageModels models with new data. The data can be a file name or a table object.

    Project = mbcmodel.CreateProject('mynewproject.mat');
    % Create gasolineOneStageModels with new data.
    T = gasolineOneStageModels(Project,Data);

  5. Save and load the new project.

    Save(Project,'mynewproject.mat');
    mbcmodel mynewproject.mat
    
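Because the generated function accepts either a file name or a table object, you can reuse it to refit the same test plan against several new data sets in one project. A minimal sketch, using only the calls shown above; the data file names are hypothetical:

```matlab
% Sketch: refitting the generated test plan against several data sets.
% The file names below are hypothetical placeholders.
files = {'dayA.xls', 'dayB.xls'};
Project = mbcmodel.CreateProject('batchfits.mat');
for k = 1:numel(files)
    % Each call creates a test plan with the original data pre-processing
    % rules, model types, and boundary models, fitted to the new data.
    T = gasolineOneStageModels(Project, files{k});
end
Save(Project, 'batchfits.mat');
```

Each iteration adds a new test plan node to the project, so you can compare the fits for the different data sets side by side in the Model Browser.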

Test Plan Tools

The eight buttons on the left (project and node management, plus the Print and Help buttons) appear in every view level. The right buttons change at different levels.

In the test plan level view, the right buttons are as follows:

Test Plan Menu

  • Edit Inputs — See Edit Model Inputs.

  • Set Up Model — See Explore Local Model Types and Explore Global Model Types.

    You can also reach these functions by double-clicking the input and model blocks in the test plan diagram. Both can be used only when a Model block is first selected in the diagram, because you must specify which model to set up: local or global.

  • Design Experiment — See About the Design Editor.

    This is also available in the toolbar and in the right-click context menu on the blocks in the test plan diagram.

  • Edit Boundary — Opens the Constraint Modeling window. Also available in the toolbar. See Explore Boundary Model Types.

  • Summary Statistics — Only enabled after you click the global model block in the test plan diagram. Opens the Summary Statistics dialog box, where you can edit the statistics shown for the global models. See Summary Statistics.

  • Fit Models — Opens the Fit Models Wizard, where you can load new data. See Select Data for Modeling Using the Fit Models Wizard.

  • Edit Data — Opens the Data Editor. See View and Edit Data in the Data Editor.

  • Validation Data — Opens a wizard to select data for validation. See Using Validation Data.

  • Make Template — Opens a dialog box for saving the current test plan as a new template, with or without designs and response models. Same as the toolbar button. See Create and Reuse Test Plan Templates.

  • Generate Code — Generates a MATLAB function that creates a new test plan from an existing test plan. See Automate Model Fits with MATLAB Function.

  • Export Point-by-Point Models — Only enabled if you have set up a two-stage model with the correct number of inputs; two global inputs are required (normally speed and load). This option provides an interface with the Point-by-Point Tradeoff in the CAGE browser part of Model-Based Calibration Toolbox™, which allows you to calibrate from local maps. See Edit Point-by-Point Model Types for details.

View Menu (Test Plan Level)

  • Design Data — Opens the Design Editor. The view design facility enables you to investigate the statistical properties of the collected data. This provides access to all the Design Editor and design evaluation utility functions with the current design rather than the pre-specified design (after data matching, the data points are used as the new design points). See About the Design Editor.

    For two-stage models, viewing level 1 designs creates a separate design for each test.

  • Model — Opens a dialog box showing the terms in the current model.

Both of these options are available only when a model or input block is selected in the test plan block diagram.