## Tuning Fuzzy Inference Systems

Designing a complex fuzzy inference system (FIS) with a large number of inputs and membership functions (MFs) is a challenging problem due to the large number of MF parameters and the rapidly growing number of rules. A promising solution to this problem is to design FIS trees, which consist of hierarchically connected modular FISs, each with a small number of inputs. Designing a FIS tree with appropriate MF parameters and rules, however, is a difficult task when adequate expert knowledge is not available for a complex system. Hence, the data-driven approach of tuning and optimizing MF and rule parameters offers an elegant solution to this problem.

You can tune a fuzzy system using the `tunefis` command. The following figure shows how a fuzzy system is tuned with input/output training data. The tuning process uses an optimization method on an optimization problem created from a fuzzy system.

In each iteration, the optimization method generates multiple candidate solutions, which are values for the selected parameters of the fuzzy system. The fuzzy system is updated with each solution and then evaluated using the input training data. The evaluated output is compared with the output training data to compute the cost of each solution. This process continues for multiple iterations until a stopping condition is met, and then the minimum-cost solution is returned along with the optimized fuzzy system parameters. For an example that uses this approach, see Tune Mamdani Fuzzy Inference System.
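Conceptually, the cost evaluation for each candidate solution resembles the following outline (an illustrative sketch, not the actual `tunefis` implementation; `settings`, `p`, `xTrain`, and `yTrain` are placeholder names):

```
% For one candidate parameter set p produced by the optimization method:
fisCandidate = setTunableValues(fis,settings,p);  % update FIS parameters
yEval = evalfis(fisCandidate,xTrain);             % evaluate on input data
cost = sqrt(mean((yTrain - yEval).^2));           % compare with output data (RMSE)
```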

If input/output training data is not available, you can use a custom model in place of the training data to evaluate a fuzzy system for cost measurement. This approach is shown in the following figure.

In this case, the custom model uses the fuzzy system to minimize the cost of achieving specific performance goals. The parameter solution that produces the best performance of the custom model is returned as the optimization result. For example, in a robot navigation model, the performance goal is to minimize the distance traveled to reach target positions without colliding with any obstacles. Hence, the navigation model uses the fuzzy system to control the robot's heading direction to achieve this goal. For an example that uses this approach, see Tune Fuzzy Systems using Custom Cost Function.

You can select individual MF and rule parameters of a fuzzy system for optimization.

### Tune Membership Function Parameters

This example shows how to specify parameter settings for each input and output MF of a FIS and tune it.

Create a FIS.

```
fis = mamfis;
fis = addInput(fis,[0 10],'NumMFs',3);
fis = addOutput(fis,[0 1],'NumMFs',3);
fis = addRule(fis,[1 1 1 1;1 1 1 1;1 1 1 1]);
```

Extract input and output parameter settings from the FIS.

`[in,out] = getTunableSettings(fis)`
```
in = 
  VariableSettings with properties:

                   Type: "input"
           VariableName: "input1"
    MembershipFunctions: [1×3 fuzzy.tuning.MembershipFunctionSettings]
                FISName: "fis"
```

```
out = 
  VariableSettings with properties:

                   Type: "output"
           VariableName: "output1"
    MembershipFunctions: [1×3 fuzzy.tuning.MembershipFunctionSettings]
                FISName: "fis"
```

The parameter settings are represented by `VariableSettings` objects that include the FIS name, variable type, variable name, and MF parameter settings. Examine the parameter settings of MF 1 of input 1.

`in(1).MembershipFunctions(1).Parameters`
```
ans = 
  NumericParameters with properties:

    Minimum: [-Inf -Inf -Inf]
    Maximum: [Inf Inf Inf]
       Free: [1 1 1]
```

For each parameter value of an input/output MF, you can specify whether it is available for tuning, as well as its minimum and maximum values. By default, all MF parameters are free for tuning, and their ranges are set to `[-Inf,Inf]`. Make MF 1 of input 1 nontunable.

`in(1).MembershipFunctions(1) = setTunable(in(1).MembershipFunctions(1),false);`

Similarly, make the first parameter of MF 2 of input 1 nontunable.

`in(1).MembershipFunctions(2).Parameters.Free(1) = false;`

Set minimum ranges for second and third parameters of MF 3 of input 1 to 0.

`in(1).MembershipFunctions(3).Parameters.Minimum(2:3) = 0;`

Set maximum ranges for second and third parameters of MF 3 of input 1 to 15.

`in(1).MembershipFunctions(3).Parameters.Maximum(2:3) = 15;`

Note that the default minimum and maximum range values of tunable MF parameters are set to corresponding input/output ranges in the tuning process.

Finally, make the output nontunable.

`out = setTunable(out,false);`

Now that you have configured the parameters, specify input and output training data. Generate some data for this example.

```
x = (0:0.1:10)';
y = abs(sin(2*x)./exp(x/5));
```

Specify options for `tunefis`. Use genetic algorithm for optimization.

`options = tunefisOptions("Method","ga");`

Specify maximum 5 generations for optimization.

`options.MethodOptions.MaxGenerations = 5;`

If you have Parallel Computing Toolbox™ software, you can improve the speed of the tuning process by setting `options.UseParallel` to `true`. If you do not have Parallel Computing Toolbox software, set `options.UseParallel` to `false`.
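For example, assuming Parallel Computing Toolbox is installed:

```
options.UseParallel = true;  % evaluate candidate solutions in parallel
```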

By default, `tunefis` uses root mean squared error (RMSE) for cost calculation. You can change the cost function to `norm1` or `norm2` by setting `options.DistanceMetric`.

`options.DistanceMetric = "norm1";`

Tune `fis` using the parameter settings, training data, and tuning options.

```
rng('default') % for reproducibility
[fisout,optimout] = tunefis(fis,[in;out],x,y,options);
```

```
                                  Best           Mean      Stall
Generation      Func-count        f(x)           f(x)    Generations
    1              100           32.84          32.84        0
    2              147           32.84          32.84        1
    3              194           32.84          32.84        2
    4              241           32.84          32.84        3
    5              288           32.84          32.84        4
Optimization terminated: maximum number of generations exceeded.
```

`fisout` contains the updated parameter values. `optimout` provides additional outputs of the optimization method and any error messages returned while updating the input fuzzy system with the optimized parameter values.

`optimout`
```
optimout = struct with fields:
    tuningOutputs: [1×1 struct]
     errorMessage: []
```
`optimout.tuningOutputs`
```
ans = struct with fields:
             x: [5 9.1667 5.8333 10 14.1667]
          fval: 32.8363
      exitflag: 0
        output: [1×1 struct]
    population: [50×5 double]
        scores: [50×1 double]
```

You can optionally tune `fis` using either input or output parameter settings. In this example, output parameter settings are set to nontunable. Therefore, tuning `fis` with only input parameter settings produces the same results.

```
rng('default')
tunefis(fis,in,x,y,options);
```

```
                                  Best           Mean      Stall
Generation      Func-count        f(x)           f(x)    Generations
    1              100           32.84          32.84        0
    2              147           32.84          32.84        1
    3              194           32.84          32.84        2
    4              241           32.84          32.84        3
    5              288           32.84          32.84        4
Optimization terminated: maximum number of generations exceeded.
```

Note that the best optimization costs shown in column 3 of both tuning results are the same, which indicates that the optimization results are the same in both cases.

### Tune Fuzzy Rules

You can specify only rule parameters of a fuzzy system for tuning.

Use `getTunableSettings` to get rule parameter settings from a fuzzy system. Rule parameter settings are returned as the third output argument.

`[~,~,rule] = getTunableSettings(fis)`
```
rule=3×1 object
  3×1 RuleSettings array with properties:

    Index
    Antecedent
    Consequent
    FISName
```

The parameter settings for each rule include the FIS name, the index of the rule in the FIS, and parameter settings for the rule antecedent and consequent (the rule clauses).

The parameter settings for a rule clause include three options:

• Whether the input/output MF indices are available for tuning. By default, clause parameters are free for tuning.

• Whether the clause allows use of NOT logic, in other words, whether it allows negative MF indices. By default, rules do not allow NOT logic.

• Whether the clause allows the absence of input/output variables, in other words, whether it allows zero MF indices. By default, the absence of a variable is allowed.

`rule(1).Antecedent(1)`
```
ans = 
  ClauseParameters with properties:

      AllowNot: 0
    AllowEmpty: 1
          Free: 1
```

Allow NOT logic in the antecedent of rule 1.

`rule(1).Antecedent.AllowNot = true;`

Make the consequent of rule 1 not available for tuning.

`rule(1).Consequent.Free = 0;`

Do not allow absence of a variable in the consequent of rule 2.

`rule(2).Consequent.AllowEmpty = false;`

Make rule 3 nontunable.

`rule(3) = setTunable(rule(3),false);`

Set `options.DistanceMetric` to `norm2`.

`options.DistanceMetric = "norm2";`

Tune `fis` with the rule parameter settings.

```
rng('default') % for reproducibility
fisout = tunefis(fis,rule,x,y,options);
```

```
                                  Best           Mean      Stall
Generation      Func-count        f(x)           f(x)    Generations
    1              100           1.648          2.575        0
    2              147           1.648          2.448        1
    3              194           1.648          2.212        2
    4              241           1.648          2.052        3
    5              288           1.648          1.874        4
Optimization terminated: maximum number of generations exceeded.
```

Because you made rule 3 nontunable, you can exclude rule 3 when you tune `fis`.

```
rng('default') % for reproducibility
tunefis(fis,rule(1:2),x,y,options);
```

```
                                  Best           Mean      Stall
Generation      Func-count        f(x)           f(x)    Generations
    1              100           1.648          2.575        0
    2              147           1.648          2.448        1
    3              194           1.648          2.212        2
    4              241           1.648          2.052        3
    5              288           1.648          1.874        4
Optimization terminated: maximum number of generations exceeded.
```

Note that the best optimization costs shown in column 3 of both tuning results are the same, which indicates the optimization results are the same in both cases.

### Learn Fuzzy Rules

You can tune a fuzzy system that contains no rules. In this case, set the `OptimizationType` option of `tunefisOptions` to `learning` so that `tunefis` learns rules for the FIS.

```
fisin = fis;
fisin.Rules = [];
options.OptimizationType = 'learning';
```

Set the maximum size of the rule base to 3. This value specifies the maximum number of rules in the tuned FIS.

`options.NumMaxRules = 3;`

Note that the size of the tuned rule base may be less than `NumMaxRules`, because `tunefis` removes duplicate rules from the tuned FIS. If you do not specify `NumMaxRules`, then `tunefis` adds the maximum number of rules determined by the combinations of input MFs. The default input MF combinations include zero MF indices, which allow absence of variables. The default combinations exclude negative MF indices, so that NOT logic is not allowed.

Set `options.DistanceMetric` to `rmse` and tune the FIS.

```
options.DistanceMetric = "rmse";
rng('default') % for reproducibility
fisout = tunefis(fisin,[],x,y,options);
```

```
                                  Best           Mean      Stall
Generation      Func-count        f(x)           f(x)    Generations
    1              400           0.165         0.2973        0
    2              590           0.165         0.2891        1
    3              780           0.165         0.2685        2
    4              970           0.165         0.2548        3
    5             1160           0.165         0.2378        4
Optimization terminated: maximum number of generations exceeded.
```

During the tuning process, the FIS automatically learns rules after cost optimization with the training data. Examine the tuned rules.

`fisout.Rules`
```
ans = 
  1×3 fisrule array with properties:

    Description
    Antecedent
    Consequent
    Weight
    Connection

  Details:
         Description
     ________________________________

  1  "input1==mf3 => output1=mf1 (1)"
  2  "input1==mf1 => output1=mf2 (1)"
  3  "input1==mf2 => output1=mf1 (1)"
```

You can remove some of the existing rules and learn additional rules.

```
fisout.Rules(2:end) = [];
rng('default') % for reproducibility
fisout = tunefis(fisin,[],x,y,options);
```

```
                                  Best           Mean      Stall
Generation      Func-count        f(x)           f(x)    Generations
    1              400           0.165         0.2973        0
    2              590           0.165         0.2891        1
    3              780           0.165         0.2685        2
    4              970           0.165         0.2548        3
    5             1160           0.165         0.2378        4
Optimization terminated: maximum number of generations exceeded.
```
`fisout.Rules`
```
ans = 
  1×3 fisrule array with properties:

    Description
    Antecedent
    Consequent
    Weight
    Connection

  Details:
         Description
     ________________________________

  1  "input1==mf3 => output1=mf1 (1)"
  2  "input1==mf1 => output1=mf2 (1)"
  3  "input1==mf2 => output1=mf1 (1)"
```

You can also tune existing rules and learn new rules.

```
fisout.Rules(2:end) = [];
fisout.Rules(1).Antecedent = 1;
fisout.Rules(1).Consequent = 1;
[~,~,rule] = getTunableSettings(fisout);
rng('default')
fisout = tunefis(fisin,rule,x,y,options);
```

```
                                  Best           Mean      Stall
Generation      Func-count        f(x)           f(x)    Generations
    1              400           0.165         0.3075        0
    2              590           0.165         0.2738        1
    3              780           0.165         0.2545        2
    4              970           0.165         0.2271        3
    5             1160           0.165         0.2083        4
Optimization terminated: maximum number of generations exceeded.
```
`fisout.Rules`
```
ans = 
  1×3 fisrule array with properties:

    Description
    Antecedent
    Consequent
    Weight
    Connection

  Details:
         Description
     ________________________________

  1  "input1==mf1 => output1=mf2 (1)"
  2  "input1==mf2 => output1=mf1 (1)"
  3  "input1==mf3 => output1=mf1 (1)"
```

### Tune MF and Rule Parameters

You can tune all FIS parameters together.

```
[in,out,rule] = getTunableSettings(fis);
options = tunefisOptions('Method','ga');
options.MethodOptions.MaxGenerations = 5;
rng('default') % for reproducibility
fisout = tunefis(fis,[in;out;rule],x,y,options);
```

```
                                  Best           Mean      Stall
Generation      Func-count        f(x)           f(x)    Generations
    1              400          0.1624         0.2997        0
    2              590          0.1624         0.2776        1
    3              780          0.1624         0.2653        2
    4              970          0.1592         0.2486        0
    5             1160          0.1592         0.2342        1
Optimization terminated: maximum number of generations exceeded.
```

For a large fuzzy system, tuning all FIS parameters in the same tuning process may take several iterations to obtain the expected results. Hence, you can tune parameters in two steps:

1. Tune or learn rule parameters only.

2. Tune both MF and rule parameters.

The first step is less computationally expensive due to the small number of rule parameters. It quickly converges to a fuzzy rule base during training. In the second step, using the rule base from the first step as an initial condition provides fast convergence of the parameter tuning process.
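The two-step workflow can be sketched as follows, assuming `fis` and training data `x` and `y` exist as in the earlier examples (the option values here are illustrative, not prescriptive):

```
% Step 1: learn a rule base only (existing rules are removed first).
fisin = fis;
fisin.Rules = [];
opt1 = tunefisOptions('Method','ga','OptimizationType','learning');
opt1.MethodOptions.MaxGenerations = 5;   % small budget for rule learning
fisRules = tunefis(fisin,[],x,y,opt1);

% Step 2: tune MF and rule parameters, starting from the learned rule base.
[in,out,rule] = getTunableSettings(fisRules);
opt2 = tunefisOptions('Method','patternsearch');
fisTuned = tunefis(fisRules,[in;out;rule],x,y,opt2);
```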

### Tune FIS Tree Parameters

You can tune the parameters of a FIS tree using a similar two-step process to the one described above for tuning a FIS.

Create a FIS tree to model $\frac{\mathrm{sin}\left(\mathit{x}\right)+\mathrm{cos}\left(\mathit{x}\right)}{\mathrm{exp}\left(\mathit{x}\right)}$ as shown in the following figure.

Create `fis1` as a Sugeno-type FIS, which results in a faster tuning process due to its computationally efficient defuzzification method. Add two inputs with range [0 10], each having three MFs. Use a smooth, differentiable MF type, such as `gaussmf`, to match the characteristics of the data you are modeling.

```
fis1 = sugfis('Name','fis1');
fis1 = addInput(fis1,[0 10],'NumMFs',3,'MFType','gaussmf');
fis1 = addInput(fis1,[0 10],'NumMFs',3,'MFType','gaussmf');
```

Add an output with range [–1.5 1.5] having nine MFs to provide maximum granularity corresponding to each combination of the input MFs. The output range is set according to the values of $\mathrm{sin}\left(\mathit{x}\right)+\mathrm{cos}\left(\mathit{x}\right)$.

`fis1 = addOutput(fis1,[-1.5 1.5],'NumMFs',9);`

Create `fis2` as a Sugeno type FIS as well. Add two inputs. Use [-1.5 1.5] as the range of the first input, which is the output of `fis1`. The second input is the same as the inputs of `fis1`, so it also uses range [0 10]. Add three MFs for each of the inputs.

```
fis2 = sugfis('Name','fis2');
fis2 = addInput(fis2,[-1.5 1.5],'NumMFs',3,'MFType','gaussmf');
fis2 = addInput(fis2,[0 10],'NumMFs',3,'MFType','gaussmf');
```

Add an output with range [0 1] having nine MFs to provide maximum granularity corresponding to each combination of the input MFs. The output range is set according to the values of $\frac{\mathrm{sin}\left(\mathit{x}\right)+\mathrm{cos}\left(\mathit{x}\right)}{\mathrm{exp}\left(\mathit{x}\right)}$.

`fis2 = addOutput(fis2,[0 1],'NumMFs',9);`

Connect the inputs and the outputs as shown in the diagram. Output 1 of `fis1` connects to input 1 of `fis2`, inputs 1 and 2 of `fis1` connect to each other, and input 2 of `fis1` connects to input 2 of `fis2`.

```
con1 = ["fis1/output1" "fis2/input1"];
con2 = ["fis1/input1" "fis1/input2"];
con3 = ["fis1/input2" "fis2/input2"];
```

Finally, create a FIS tree using `fis1`, `fis2`, `con1`, `con2`, and `con3`.

`fisT = fistree([fis1 fis2],[con1;con2;con3]);`

Add an output to the FIS tree so that you can access the output of `fis1`.

`fisT.Outputs = ["fis1/output1";fisT.Outputs];`

Generate input and output training data.

```
x = (0:0.1:10)';
y1 = sin(x)+cos(x);
y2 = y1./exp(x);
y = [y1 y2];
```

Tune the FIS tree parameters in two steps. First, use a global optimization method such as particle swarm or genetic algorithm to learn the rules of the FIS tree. Create `tunefis` options for learning with the `particleswarm` method.

`options = tunefisOptions('Method','particleswarm','OptimizationType','learning');`

This tuning step uses a small number of iterations to learn a rule base without overfitting the training data. The rule base provides an educated initial condition for the second step, which optimizes all the FIS tree parameters together. Set the maximum number of iterations to 5, and learn the rule base.

```
options.MethodOptions.MaxIterations = 5;
rng('default') % for reproducibility
fisTout1 = tunefis(fisT,[],x,y,options);
```

```
                                 Best            Mean     Stall
Iteration     f-count            f(x)            f(x)    Iterations
    0           100            0.6682          0.9395        0
    1           200            0.6682           1.023        0
    2           300            0.6652          0.9308        0
    3           400            0.6259           0.958        0
    4           500            0.6259           0.918        1
    5           600            0.5969          0.9179        0
Optimization ended: number of iterations exceeded OPTIONS.MaxIterations.
```

To tune all the FIS tree parameters, the second step uses a local optimization method, such as pattern search or simulated annealing. Local optimization is generally faster than global optimization and can produce better results when the input fuzzy system parameters are already consistent with the training data.

Use the `patternsearch` method for optimization. Set the number of iterations to 25.

```
options.Method = 'patternsearch';
options.MethodOptions.MaxIterations = 25;
```

Use `getTunableSettings` to get input, output, and rule parameter settings from `fisTout1`.

`[in,out,rule] = getTunableSettings(fisTout1);`

Tune all FIS tree parameters.

```
rng('default') % for reproducibility
fisTout2 = tunefis(fisTout1,[in;out;rule],x,y,options);
```

```
Iter     Func-count        f(x)       MeshSize     Method
   0           1        0.596926          1
   1           3        0.551284          2       Successful Poll
   2          13        0.548551          4       Successful Poll
   3          20        0.546331          8       Successful Poll
   4          33        0.527482         16       Successful Poll
   5          33        0.527482          8       Refine Mesh
   6          61        0.511532         16       Successful Poll
   7          61        0.511532          8       Refine Mesh
   8          92        0.505355         16       Successful Poll
   9          92        0.505355          8       Refine Mesh
  10         128        0.505355          4       Refine Mesh
  11         175        0.487734          8       Successful Poll
  12         212        0.487734          4       Refine Mesh
  13         265        0.487734          2       Refine Mesh
  14         275        0.486926          4       Successful Poll
  15         328        0.486926          2       Refine Mesh
  16         339        0.483683          4       Successful Poll
  17         391        0.483683          2       Refine Mesh
  18         410        0.442624          4       Successful Poll
  19         462        0.442624          2       Refine Mesh
  20         469        0.44051           4       Successful Poll
  21         521        0.44051           2       Refine Mesh
  22         542        0.435381          4       Successful Poll
  23         594        0.435381          2       Refine Mesh
  24         614        0.398872          4       Successful Poll
  25         662        0.398385          8       Successful Poll
  26         698        0.398385          4       Refine Mesh
Maximum number of iterations exceeded: increase options.MaxIterations.
```

The optimization cost reduces from 0.59 to 0.39 in the second step.

### Tune FIS Tree with Selected Fuzzy Systems

You can tune specific fuzzy systems in a FIS tree. To get parameter settings of the specific fuzzy systems, use `getTunableSettings`. For example, after learning the rule base of the previous FIS tree, separately tune `fis1` and `fis2` parameters. First, get the parameter settings for `fis1`.

`[in,out,rule] = getTunableSettings(fisTout1,"FIS","fis1");`

Tune `fis1` parameters of the tree.

```
rng('default')
fisTout2 = tunefis(fisTout1,[in;out;rule],x,y,options);
```

```
Iter     Func-count        f(x)       MeshSize     Method
   0           1        0.596926          1
   1           3        0.551284          2       Successful Poll
   2          18        0.510362          4       Successful Poll
   3          28        0.494804          8       Successful Poll
   4          56        0.494804          4       Refine Mesh
   5          84        0.493422          8       Successful Poll
   6         107        0.492883         16       Successful Poll
   7         107        0.492883          8       Refine Mesh
   8         136        0.492883          4       Refine Mesh
   9         171        0.492883          2       Refine Mesh
  10         178        0.491534          4       Successful Poll
  11         213        0.491534          2       Refine Mesh
  12         229        0.482682          4       Successful Poll
  13         264        0.482682          2       Refine Mesh
  14         279        0.446645          4       Successful Poll
  15         313        0.446645          2       Refine Mesh
  16         330        0.44657           4       Successful Poll
  17         364        0.44657           2       Refine Mesh
  18         384        0.446495          4       Successful Poll
  19         418        0.446495          2       Refine Mesh
  20         461        0.445938          4       Successful Poll
  21         495        0.445938          2       Refine Mesh
  22         560        0.422421          4       Successful Poll
  23         594        0.422421          2       Refine Mesh
  24         597        0.397265          4       Successful Poll
  25         630        0.397265          2       Refine Mesh
  26         701        0.390338          4       Successful Poll
Maximum number of iterations exceeded: increase options.MaxIterations.
```

In this case, the optimization cost is improved by tuning only `fis1` parameter values.

Next, get the parameter settings for `fis2` and tune the `fis2` parameters.

```
[in,out,rule] = getTunableSettings(fisTout2,"FIS","fis2");
rng('default')
fisTout3 = tunefis(fisTout2,[in;out;rule],x,y,options);
```

```
Iter     Func-count        f(x)       MeshSize     Method
   0           1        0.390338          1
   1           2        0.374103          2       Successful Poll
   2           5        0.373855          4       Successful Poll
   3          10        0.356619          8       Successful Poll
   4          33        0.356619          4       Refine Mesh
   5          43        0.350715          8       Successful Poll
   6          65        0.349417         16       Successful Poll
   7          65        0.349417          8       Refine Mesh
   8          87        0.349417          4       Refine Mesh
   9          91        0.349356          8       Successful Poll
  10         112        0.349356          4       Refine Mesh
  11         138        0.346102          8       Successful Poll
  12         159        0.346102          4       Refine Mesh
  13         172        0.345938          8       Successful Poll
  14         193        0.345938          4       Refine Mesh
  15         222        0.342721          8       Successful Poll
  16         244        0.342721          4       Refine Mesh
  17         275        0.342721          2       Refine Mesh
  18         283        0.340727          4       Successful Poll
  19         312        0.340554          8       Successful Poll
  20         335        0.340554          4       Refine Mesh
  21         366        0.340554          2       Refine Mesh
  22         427        0.337873          4       Successful Poll
  23         457        0.337873          2       Refine Mesh
  24         521        0.33706           4       Successful Poll
  25         551        0.33706           2       Refine Mesh
  26         624        0.333193          4       Successful Poll
Maximum number of iterations exceeded: increase options.MaxIterations.
```

The optimization cost is further reduced by tuning `fis2` parameter values. To avoid overfitting of individual FIS parameter values, you can further tune both `fis1` and `fis2` parameters together.

```
[in,out,rule] = getTunableSettings(fisTout3);
rng('default')
fisTout4 = tunefis(fisTout3,[in;out;rule],x,y,options);
```

```
Iter     Func-count        f(x)       MeshSize     Method
   0           1        0.333193          1
   1           8        0.326804          2       Successful Poll
   2          91        0.326432          4       Successful Poll
   3         116        0.326261          8       Successful Poll
   4         154        0.326261          4       Refine Mesh
   5         205        0.326261          2       Refine Mesh
   6         302        0.326092          4       Successful Poll
   7         352        0.326092          2       Refine Mesh
   8         391        0.325964          4       Successful Poll
   9         441        0.325964          2       Refine Mesh
  10         478        0.32578           4       Successful Poll
  11         528        0.32578           2       Refine Mesh
  12         562        0.325691          4       Successful Poll
  13         612        0.325691          2       Refine Mesh
  14         713        0.229273          4       Successful Poll
  15         763        0.229273          2       Refine Mesh
  16         867        0.22891           4       Successful Poll
  17         917        0.22891           2       Refine Mesh
  18        1036        0.228688          4       Successful Poll
  19        1086        0.228688          2       Refine Mesh
  20        1212        0.228688          1       Refine Mesh
  21        1266        0.228445          2       Successful Poll
  22        1369        0.228441          4       Successful Poll
  23        1381        0.227645          8       Successful Poll
  24        1407        0.226125         16       Successful Poll
  25        1407        0.226125          8       Refine Mesh
  26        1447        0.226125          4       Refine Mesh
Maximum number of iterations exceeded: increase options.MaxIterations.
```

Overall, the optimization cost reduces from 0.59 to 0.22 in three steps.

### Tune with Custom Cost Function

Suppose you want to modify the previous FIS tree as shown in the following diagram.

Create the FIS tree.

```
fis1 = sugfis('Name','fis1');
fis1 = addInput(fis1,[0 10],'NumMFs',3,'MFType','gaussmf');
fis1 = addOutput(fis1,[-1 1],'NumMFs',3);
fis2 = sugfis('Name','fis2');
fis2 = addInput(fis2,[0 10],'NumMFs',3,'MFType','gaussmf');
fis2 = addOutput(fis2,[-1 1],'NumMFs',3);
fis3 = sugfis('Name','fis3');
fis3 = addInput(fis3,[0 10],'NumMFs',3,'MFType','gaussmf');
fis3 = addOutput(fis3,[0 1],'NumMFs',3);
con = ["fis1/input1" "fis2/input1";"fis2/input1" "fis3/input1"];
fisT = fistree([fis1 fis2 fis3],con);
```

To implement the addition and multiplication operations, use a custom cost function. For this example, use the function `customcostfcn`, included at the end of the example. Learn a rule base with this cost function.
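The `customcostfcn` function appears at the end of the original example and is not reproduced here. A hypothetical sketch of such a cost function, which evaluates the FIS tree outputs, combines them, and returns an RMSE cost, might look like the following; the specific combination of outputs is an assumption for illustration only:

```
function cost = customcostfcn(fis,x,y)
% Hypothetical sketch of a custom cost function for tunefis.
actY = evalfis(fis,x);                    % evaluate all FIS tree outputs
simY = actY(:,1) + actY(:,2).*actY(:,3);  % illustrative add/multiply step
cost = sqrt(mean((y - simY).^2));         % RMSE against reference data
end
```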

```
options.Method = 'particleswarm';
options.MethodOptions.MaxIterations = 5;
rng('default')
fisTout1 = tunefis(fisT,[],@(fis)customcostfcn(fis,x,y),options);
```

```
                                 Best            Mean     Stall
Iteration     f-count            f(x)            f(x)    Iterations
    0           100             0.746           1.318        0
    1           200            0.5787           1.236        0
    2           300            0.5787           1.104        1
    3           400            0.5787           1.097        0
    4           500            0.5171           1.155        0
    5           600            0.5171           1.067        1
Optimization ended: number of iterations exceeded OPTIONS.MaxIterations.
```

Tune all parameters of the FIS tree.

```
options.Method = 'patternsearch';
options.MethodOptions.MaxIterations = 25;
[in,out,rule] = getTunableSettings(fisTout1);
rng('default')
fisTout2 = tunefis(fisTout1,[in;out;rule],@(fis)customcostfcn(fis,x,y),options);
```

```
Iter     Func-count        f(x)       MeshSize     Method
   0           1        0.51705           1
   1          11        0.514884          2       Successful Poll
   2          21        0.512873          4       Successful Poll
   3          43        0.512873          8       Successful Poll
   4          56        0.512873          4       Refine Mesh
   5          79        0.512873          2       Refine Mesh
   6         106        0.512869          4       Successful Poll
   7         129        0.512869          2       Refine Mesh
   8         174        0.512869          1       Refine Mesh
   9         197        0.512862          2       Successful Poll
  10         242        0.512862          1       Refine Mesh
  11         314        0.512862          0.5     Refine Mesh
  12         388        0.512862          0.25    Refine Mesh
  13         422        0.510163          0.5     Successful Poll
  14         429        0.509153          1       Successful Poll
  15         439        0.509034          2       Successful Poll
  16         460        0.509034          4       Successful Poll
  17         483        0.507555          8       Successful Poll
  18         495        0.507555          4       Refine Mesh
  19         519        0.507555          2       Refine Mesh
  20         565        0.507555          1       Refine Mesh
  21         636        0.507555          2       Successful Poll
  22         682        0.507555          1       Refine Mesh
  23         755        0.507555          0.5     Refine Mesh
  24         799        0.507554          1       Successful Poll
  25         872        0.507554          0.5     Refine Mesh
  26         947        0.507554          0.25    Refine Mesh
Maximum number of iterations exceeded: increase options.MaxIterations.
```

You can add more input/output MFs and specify additional FIS tree outputs to improve the tuning performance. Using additional MF parameters and more training data for additional FIS tree outputs can further fine-tune the outputs of `fis1`, `fis2`, and `fis3`.

### Custom Optimization Method

You can implement your own FIS parameter optimization method using `getTunableSettings`, `getTunableValues`, and `setTunableValues`. This example uses these functions to tune a rule base of a fuzzy system.

Create a FIS to approximate $\mathrm{sin}\left(\theta \right)$, where $\theta$ varies from 0 to $2\pi$.

`fisin = mamfis;`

Add an input with range [0 $2\pi$] having five MFs of Gaussian type, and an output with range [–1 1] having five MFs of Gaussian type.

```
fisin = addInput(fisin,[0 2*pi],'NumMFs',5,'MFType','gaussmf');
fisin = addOutput(fisin,[-1 1],'NumMFs',5,'MFType','gaussmf');
```

Add rules to the FIS, and display them.

```
fisin = addRule(fisin,[1 1 1 1;2 2 1 1;3 3 1 1;4 4 1 1;5 5 1 1]);
fisin.Rules
```

```
ans = 
  1×5 fisrule array with properties:

    Description
    Antecedent
    Consequent
    Weight
    Connection

  Details:
         Description
     ________________________________

  1  "input1==mf1 => output1=mf1 (1)"
  2  "input1==mf2 => output1=mf2 (1)"
  3  "input1==mf3 => output1=mf3 (1)"
  4  "input1==mf4 => output1=mf4 (1)"
  5  "input1==mf5 => output1=mf5 (1)"
```

Set `DisableStructuralChecks` to `true` for faster FIS updates.

`fisin.DisableStructuralChecks = true;`

Get the rule parameter settings.

`[~,~,rule] = getTunableSettings(fisin);`

Make the antecedent nontunable. In the consequent, do not allow NOT logic (negative MF indices) or empty variables (zero MF indices) in the rules.

```
for i = 1:numel(rule)
    rule(i).Antecedent.Free = false;
    rule(i).Consequent.AllowNot = false;
    rule(i).Consequent.AllowEmpty = false;
end
```

Generate data for tuning.

```
x = (0:0.1:2*pi)';
y = sin(x);
```

To tune the rule parameters, use the `customtunefis` function defined at the end of this example. Set the number of iterations to 2, and do not allow invalid parameter values when updating the FIS using `setTunableValues`.
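The `customtunefis` function is defined at the end of the original example and is not reproduced in this section. A minimal sketch of such a custom tuner, using `getTunableValues` and `setTunableValues` with a simple random-search strategy (an assumption for illustration, not the actual implementation), could look like this:

```
function fis = customtunefis(fis,settings,x,y,numite,ignoreinvp)
% Hypothetical sketch: random search over tunable rule parameter values.
bestVals = getTunableValues(fis,settings);
bestCost = findcost(fis,x,y);
fprintf('Initial cost = %f\n',bestCost);
for ite = 1:numite
    vals = bestVals + randi([-1 1],size(bestVals));   % perturb MF indices
    fisNew = setTunableValues(fis,settings,vals, ...
        'IgnoreInvalidParameters',ignoreinvp);
    cost = findcost(fisNew,x,y);
    if cost < bestCost
        bestCost = cost;
        bestVals = vals;
        fis = fisNew;
    end
    fprintf('Iteration %d: Cost = %f\n',ite,bestCost);
end
end

function cost = findcost(fis,x,y)
cost = sqrt(mean((y - evalfis(fis,x)).^2));  % RMSE cost
end
```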

```
numite = 2;
ignoreinvp = false;
fisout = customtunefis(fisin,rule,x,y,numite,ignoreinvp);
```

```
Initial cost = 1.170519
Iteration 1: Cost = 0.241121
Iteration 2: Cost = 0.241121
```

Display tuned rules.

`fisout.Rules`
```
ans = 
  1×5 fisrule array with properties:

    Description
    Antecedent
    Consequent
    Weight
    Connection

  Details:
         Description
     ________________________________

  1  "input1==mf1 => output1=mf4 (1)"
  2  "input1==mf2 => output1=mf5 (1)"
  3  "input1==mf3 => output1=mf3 (1)"
  4  "input1==mf4 => output1=mf1 (1)"
  5  "input1==mf5 => output1=mf2 (1)"
```

Allow NOT logic in the rules, and optimize the FIS again.

```
for i = 1:numel(rule)
    rule(i).Consequent.AllowNot = true;
end
fisout = customtunefis(fisin,rule,x,y,numite,ignoreinvp);
```

```
Initial cost = 1.170519
Iteration 1: Cost = 0.357052
Iteration 2: Cost = 0.241121
```
`fisout.Rules`
```
ans = 
  1×5 fisrule array with properties:

    Description
    Antecedent
    Consequent
    Weight
    Connection

  Details:
         Description
     ________________________________

  1  "input1==mf1 => output1=mf4 (1)"
  2  "input1==mf2 => output1=mf5 (1)"
  3  "input1==mf3 => output1=mf3 (1)"
  4  "input1==mf4 => output1=mf1 (1)"
  5  "input1==mf5 => output1=mf2 (1)"
```

With NOT logic, there are more combinations of rule parameters, and it generally takes more iterations to tune a FIS.

Next, reset `AllowNot` to `false` and set `AllowEmpty` to `true`, in other words, allow absence of variables (zero output MF indices) in the consequent. Tune the FIS with the updated rule parameter settings.

```
for i = 1:numel(rule)
    rule(i).Consequent.AllowNot = false;
    rule(i).Consequent.AllowEmpty = true;
end
try
    fisout = customtunefis(fisin,rule,x,y,numite,ignoreinvp);
catch me
    disp("Error: "+me.message)
end
```

```
Initial cost = 1.170519
Error: Rule consequent must have at least one nonzero membership function index.
```

The tuning process fails because the FIS contains only one output, which must be nonzero (nonempty) in the rule consequent. To ignore invalid parameter values, specify the `IgnoreInvalidParameters` option of `setTunableValues`.

Set `ignoreinvp` to `true`, which specifies `IgnoreInvalidParameters` value in the call to `setTunableValues` used in `customtunefis`.

```
ignoreinvp = true;
fisout = customtunefis(fisin,rule,x,y,numite,ignoreinvp);
```

```
Initial cost = 1.170519
Iteration 1: Cost = 0.241121
Iteration 2: Cost = 0.241121
```
`fisout.Rules`
```
ans = 
  1×5 fisrule array with properties:

    Description
    Antecedent
    Consequent
    Weight
    Connection

  Details:
         Description
     ________________________________

  1  "input1==mf1 => output1=mf4 (1)"
  2  "input1==mf2 => output1=mf5 (1)"
  3  "input1==mf3 => output1=mf3 (1)"
  4  "input1==mf4 => output1=mf1 (1)"
  5  "input1==mf5 => output1=mf2 (1)"
```

In this case, the tuning process bypasses the invalid values and uses only valid parameter values for optimization.

By default, `tunefis` ignores invalid values when updating fuzzy system parameters. You can change this behavior by setting `tunefisOptions.IgnoreInvalidParameters` to `false`.

### Generate FIS from Data and Tune

You can generate a FIS from the training data using `genfis` and then optimize the FIS with `tunefis`. In this approach, the tuning process can employ a local optimization method because the rule base is derived from the training data.

This example describes the tuning steps to approximate the function

$\frac{\mathrm{sin}\left(2\mathit{x}\right)}{\mathrm{exp}\left(\frac{\mathit{x}}{5}\right)}$,

where the input $\mathit{x}$ varies from 0 to 10.

Generate training data.

```x = (0:0.1:10)'; y = sin(2*x)./exp(x/5);```

Create options for `genfis` that specify five MFs, a Gaussian MF for the input, and a constant MF for the output.

```
goptions = genfisOptions('GridPartition','NumMembershipFunctions',5, ...
    'InputMembershipFunctionType','gaussmf', ...
    'OutputMembershipFunctionType','constant');
```

Generate the initial FIS, and get its parameter settings.

```
fisin = genfis(x,y,goptions);
[in,out,rule] = getTunableSettings(fisin);
```

Use the pattern search method for optimization, set the maximum number of iterations to 25, and tune the FIS.

```
toptions = tunefisOptions('Method','patternsearch');
toptions.MethodOptions.MaxIterations = 25;
rng('default')
fisout = tunefis(fisin,[in;out],x,y,toptions);
```

```
Iter     Func-count        f(x)       MeshSize     Method
   0           1        0.346649          1
   1          19        0.346649          0.5     Refine Mesh
   2          28        0.295219          1       Successful Poll
   3          34        0.295069          2       Successful Poll
   4          48        0.295069          1       Refine Mesh
   5          56        0.295064          2       Successful Poll
   6          71        0.294986          4       Successful Poll
   7          82        0.294986          2       Refine Mesh
   8          98        0.294986          1       Refine Mesh
   9         112        0.293922          2       Successful Poll
  10         128        0.293922          1       Refine Mesh
  11         131        0.29151           2       Successful Poll
  12         144        0.290141          4       Successful Poll
  13         156        0.290141          2       Refine Mesh
  14         171        0.290006          4       Successful Poll
  15         184        0.290006          2       Refine Mesh
  16         200        0.290006          1       Refine Mesh
  17         207        0.289743          2       Successful Poll
  18         223        0.289743          1       Refine Mesh
  19         243        0.289743          0.5     Refine Mesh
  20         257        0.286935          1       Successful Poll
  21         260        0.282278          2       Successful Poll
  22         263        0.281878          4       Successful Poll
  23         267        0.280144          8       Successful Poll
  24         272        0.280144          4       Refine Mesh
  25         278        0.275167          8       Successful Poll
  26         284        0.275167          4       Refine Mesh
Maximum number of iterations exceeded: increase options.MaxIterations.
```

You can increase the maximum number of iterations to further reduce the cost.
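For example, assuming the variables from the previous steps are still in the workspace, one possible continuation reuses the tuned FIS as the starting point with a larger iteration budget (the value 50 here is an arbitrary example).

```
% Continue tuning from the previous result with more iterations.
% The budget of 50 iterations is an arbitrary example value.
toptions.MethodOptions.MaxIterations = 50;
rng('default')
fisout2 = tunefis(fisout,[in;out],x,y,toptions);
```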

### Validate Training Results

To avoid overfitting to the training data, check the performance of a trained fuzzy system with validation data. One common approach is to divide the available data into training and validation sets such that both sets have similar characteristics.

If training and validation performances differ significantly, you can:

• Change the maximum number of tuning iterations. For example, if the training performance is better than validation performance, you can reduce the number of tuning iterations.

• Resample the training and validation data sets to maintain homogeneity.

To see how the validation performance can vary based on nonhomogeneous sampling, create a single-input-single-output fuzzy inference system to model the function

$\frac{\mathrm{sin}\left(2\mathit{x}\right)}{\mathrm{exp}\left(\frac{\mathit{x}}{5}\right)}$,

where the input varies from 0 to 10 and the output range is [0 1]. Add five default MFs to the input and output. Get the input and output parameter settings.

```
fisin = sugfis;
fisin = addInput(fisin,[0 10],'NumMFs',5,'MFType','gaussmf');
fisin = addOutput(fisin,[0 1],'NumMFs',5);
[in,out] = getTunableSettings(fisin);
```

Create `tunefis` options for `learning` with the `particleswarm` optimization method, and set the maximum number of iterations to 5.

```
options = tunefisOptions('Method','particleswarm','OptimizationType','learning');
options.MethodOptions.MaxIterations = 5;
```

Divide the available data into two sets without maintaining homogeneity in the data sets.

```
x = (0:0.1:10)';
n = numel(x);
midn = floor(n/2);
trnX = x(1:midn);
vldX = x(midn+1:end);
f = @(x)(sin(2*x)./exp(x/5));
trnY = f(trnX);
vldY = f(vldX);
```

Tune the FIS parameters.

```
rng('default')
fisout = tunefis(fisin,[in;out],trnX,trnY,options);
```
```
                                 Best            Mean     Stall
Iteration     f-count            f(x)            f(x)    Iterations
    0             100          0.4279          0.5932            0
    1             200          0.3846          0.6183            0
    2             300          0.3751          0.5675            0
    3             400          0.3606           0.568            0
    4             500          0.3606          0.5596            1
    5             600          0.3598          0.5307            0
Optimization ended: number of iterations exceeded OPTIONS.MaxIterations.
```

Find the optimization cost for validation data using the `findcost` function defined at the end of this example.

```
vldCost = findcost(fisout,vldX,vldY)
```
```
vldCost = 0.1780
```

Note that the validation cost differs from the best training cost shown in column 3 of the tuning result.
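To make this comparison explicit, you can evaluate the training cost with the same `findcost` helper defined in the Local Functions section. The variable names in this sketch are illustrative.

```
% Illustrative comparison of training and validation costs using the
% findcost helper. A large gap suggests nonhomogeneous sampling.
trnCost = findcost(fisout,trnX,trnY);
fprintf('Training cost = %f, validation cost = %f\n',trnCost,vldCost);
```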

Next, resample the available data, and create two homogeneous data sets for training and validation.

```
trnX = x(1:2:end);
vldX = x(2:2:end);
f = @(x)(sin(2*x)./exp(x/5));
trnY = f(trnX);
vldY = f(vldX);
```

Tune the FIS with the new training data.

```
rng('default')
fisout = tunefis(fisin,[in;out],trnX,trnY,options);
```
```
                                 Best            Mean     Stall
Iteration     f-count            f(x)            f(x)    Iterations
    0             100          0.3445          0.5852            0
    1             200          0.2996          0.5616            0
    2             300          0.2907          0.5381            0
    3             400          0.2878          0.5334            0
    4             500          0.2878          0.5624            1
    5             600          0.2877             Inf            0
Optimization ended: number of iterations exceeded OPTIONS.MaxIterations.
```

Find the optimization cost for the new validation data.

```
vldCost = findcost(fisout,vldX,vldY)
```
```
vldCost = 0.2803
```

In this case, homogeneous sampling reduces the difference between the training and validation costs.

### Ways to Improve Tuning Results

You can improve the training error of a tuned fuzzy system by following these guidelines.

• Use multiple steps in a tuning process. For example, first learn the rules of a fuzzy system, and then tune input/output MF parameters using the learned rule base. You can also separately tune individual FIS parameters in a FIS tree and then tune all the fuzzy systems together to generalize the parameter values.

• Increase the number of iterations in both the rule-learning and parameter-tuning phases. Doing so increases the duration of the optimization process. It can also increase the validation error if the system parameters become overtuned to the training data.

• Use global optimization methods, such as `ga` and `particleswarm`, in both rule-learning and parameter-tuning phases. Global optimizers `ga` and `particleswarm` perform better for large parameter tuning ranges than local optimizers. The local optimizers `patternsearch` and `simulannealbnd` perform better for small parameter ranges. If rules are already added to a FIS tree using training data, then `patternsearch` and `simulannealbnd` can produce faster convergence compared to `ga` and `particleswarm`. For more information on these optimization methods and their options, see `ga`, `particleswarm`, `patternsearch`, and `simulannealbnd`.

• Change the clustering technique used by `genfis`. Depending on the clustering technique, the generated rules can differ in their representation of the training data. Hence, the use of different clustering techniques can affect the performance of `tunefis`.

• Change FIS properties. Try changing properties such as the type of FIS, number of inputs, number of input/output MFs, MF types, and number of rules. A Sugeno system has fewer output MF parameters (assuming constant MFs) and faster defuzzification. Therefore, for fuzzy systems with a large number of inputs, a Sugeno FIS generally converges faster than a Mamdani FIS. Small numbers of MFs and rules reduce the number of parameters to tune, producing a faster tuning process. Furthermore, a large number of rules might overfit the training data.

• Modify tunable parameter settings for MFs and rules. For example, you can tune the support of a triangular MF without changing its peak location. Doing so reduces the number of tunable parameters and can produce a faster tuning process for specific applications. For rules, you can exclude zero MF indices by setting the `AllowEmpty` tunable setting to `false`, which reduces the overall number of rules during the learning phase.

• Change FIS tree properties, such as the number of fuzzy systems and the connections between the fuzzy systems.

• Use different ranking and grouping of the inputs to the FIS tree. For more information about creating FIS trees, see Fuzzy Trees.
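As a sketch of the first guideline, assuming an initial FIS `fisin` and training data `trnX`/`trnY` as in the earlier examples, a two-phase process might first learn a rule base with a global method and then tune the MF parameters with a local method. The iteration budgets here are arbitrary example values.

```
% Phase 1: learn a rule base with a global method (genetic algorithm).
[in,out,rule] = getTunableSettings(fisin);
loptions = tunefisOptions('Method','ga','OptimizationType','learning');
loptions.MethodOptions.MaxGenerations = 20;   % example budget
fisl = tunefis(fisin,rule,trnX,trnY,loptions);

% Phase 2: keep the learned rules and tune the input/output MF
% parameters with a local method, which typically converges faster
% once the rule base is in place.
toptions = tunefisOptions('Method','patternsearch');
toptions.MethodOptions.MaxIterations = 25;    % example budget
fist = tunefis(fisl,[in;out],trnX,trnY,toptions);
```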

### Local Functions

```
function cost = customcostfcn(fis,x,y)
tY = evalfis(fis,x);
sincosx = tY(:,1)+tY(:,2);
sincosexpx = sincosx.*tY(:,3);
actY = [sincosx;sincosexpx];
d = y(:)-actY;
cost = sqrt(mean(d.*d));
end

function fis = customtunefis(fis,rule,x,y,n,ignore)
% Show initial cost.
cost = findcost(fis,x,y);
fprintf('Initial cost = %f\n',cost);

% Optimize rule parameters.
numMFs = numel(fis.Outputs.MembershipFunctions);
for ite = 1:n
    for i = 1:numel(rule)
        % Get consequent value.
        pval = getTunableValues(fis,rule(i));
        % Loop through output MF indices to minimize the cost.
        % Use output indices according to AllowNot and AllowEmpty.
        allowNot = rule(i).Consequent.AllowNot;
        allowEmpty = rule(i).Consequent.AllowEmpty;
        if allowNot && allowEmpty
            mfID = -numMFs:numMFs;
        elseif allowNot && ~allowEmpty
            mfID = [-numMFs:-1 1:numMFs];
        elseif ~allowNot && allowEmpty
            mfID = 0:numMFs;
        else
            mfID = 1:numMFs;
        end
        cost = 1000;
        minCostFIS = fis;
        for j = 1:length(mfID)
            % Update consequent value.
            pval(1) = mfID(j);
            % Set updated consequent value to the FIS.
            fis = setTunableValues(fis,rule(i),pval,'IgnoreInvalidParameters',ignore);
            % Evaluate cost.
            rmse = findcost(fis,x,y);
            % Update FIS with the minimum cost.
            if rmse<cost
                cost = rmse;
                minCostFIS = fis;
            end
        end
        fis = minCostFIS;
    end
    fprintf('Iteration %d: Cost = %f\n',ite,cost);
end
end

function cost = findcost(fis,x,y)
actY = evalfis(fis,x);
d = y - actY;
cost = sqrt(mean(d.*d));
end
```