
Minimizing a prebuilt cost function

I hope this finds everyone well.
I have been attempting to minimize a complex function that depends on a 6x7 initial guess matrix. I have built code that outputs a weighted least-squares difference between the experimental and predicted data. Is there a way to use fmincon, fminsearch, etc. to minimize the value produced by this cost function?
To summarize, I have a model that I transformed into a function whose only input is that 6x7 initial guess matrix and whose output is a value expressing the difference between the numerical simulation and the experimental data. I wish to minimize this value using fmincon, or any other solver, to form the guesses that are input into this function; a rough sketch of what I mean is below.
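As a minimal sketch of the kind of call I have in mind (costFun and Guess0 are placeholder names standing in for my cost function and initial guess matrix; fminsearch and fmincon work on a vector of unknowns, so the 6x7 matrix is flattened going in and reshaped coming out):
% Sketch only: costFun and Guess0 are placeholders for the actual cost
% function (6x7 matrix in, scalar misfit out) and the initial guess matrix.
Guess0 = rand(6,7);                        % placeholder 6x7 initial guess
vecObj = @(v) costFun(reshape(v, 6, 7));   % the solver sees a 42x1 vector

% Unconstrained search with fminsearch:
[vOpt, fval] = fminsearch(vecObj, Guess0(:));

% Or, with (placeholder) bounds, fmincon:
% lb = -ones(42,1); ub = ones(42,1);
% [vOpt, fval] = fmincon(vecObj, Guess0(:), [], [], [], [], lb, ub);

Xopt = reshape(vOpt, 6, 7);                % recover the optimal 6x7 matrix
Guess0(:) and reshape(vOpt, 6, 7) both follow MATLAB's column-major ordering, so they are consistent inverses of each other.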
Thank you for your time!
Kevin
  10 Comments
Matt J on 10 Feb 2023
Edited: Matt J on 10 Feb 2023
"Yes to all! The absolute difference between TsWuSph(x0) and cfinal(3,3) is what I wish to minimize."
Since cfinal(3,3) is a scalar value, that would be equivalent to solving for multiple unknowns x0 given a single equation. It is a considerably under-determined problem.
Kevin Hanekom on 10 Feb 2023
Edited: Kevin Hanekom on 10 Feb 2023
Thank you for the input, Matt. I apologize for the confusion; in this case x0 is a single variable that I am inputting into the function I have defined, called TsWuSph. This function outputs an expected numerical value, which I wish to bring as close as possible to the experimental scalar value cfinal(3,3). Just to summarize, x0 should only be a single unknown output in this case.


Accepted Answer

Kevin Hanekom on 10 Feb 2023
My problem was a classic example of a derivative-based algorithm converging to a local, but not global, minimum. To solve this, one can use a heuristic or population-based algorithm, in this case either a genetic algorithm (GA) or simulated annealing, as described in this great textbook: MIT Book.
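For completeness, a sketch of what such a global search could look like with the Global Optimization Toolbox (lb and ub are placeholder bounds on the unknown; the scalar cost is the one discussed in the comments above):
% Sketch only: requires the Global Optimization Toolbox; lb and ub are
% placeholder bounds on the single unknown x0.
c   = cfinal(3,3);
fun = @(x0) abs(TsWuSph(x0) - c);   % scalar cost from the comments above

lb = 0;  ub = 1;                    % placeholder search bounds
nvars = 1;                          % one unknown in the simplified problem

% Genetic algorithm (population-based):
xGA = ga(fun, nvars, [], [], [], [], lb, ub);

% Simulated annealing, starting from an initial guess:
xSA = simulannealbnd(fun, Guess, lb, ub);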
Thank you everyone for your help.
  2 Comments
Matt J on 10 Feb 2023
It is really unlikely you would need to do that just to avoid local minima in a 1-parameter problem. You would probably just sample the function over a range of points and use min:
c = cfinal(3,3);                    % experimental target value
fun = @(x0) abs(TsWuSph(x0) - c);   % scalar cost as a function of x0
x = linspace(a, b);                 % sample points over the search range [a,b]
[~, i] = min(arrayfun(fun, x));     % evaluate the cost at each sample point
Guess = x(i);                       % best sample; can also seed a local solver
Kevin Hanekom on 10 Feb 2023
The one-parameter problem was just an initial simplification of the much more complex problem statement. I am sure the min function would work for the one-parameter problem. Thank you for your help with my problem.


More Answers (1)

Matt J on 10 Feb 2023
Edited: Matt J on 10 Feb 2023
"Just to summarize, x0 should only be a single unknown output in this case."
If so, both lsqnonlin and fmincon are overkill. You should just use fminbnd or fminsearch, e.g.,
c = cfinal(3,3);                                             % experimental target value
[x, fval] = fminsearch( @(x0) abs(TsWuSph(x0) - c), Guess)   % minimize |TsWuSph(x0) - c| starting from Guess
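For the bounded alternative, fminbnd searches an interval rather than starting from a guess (a and b below are placeholder endpoints of a physically reasonable range for x0):
% a and b are placeholder interval endpoints for the search range of x0.
[x, fval] = fminbnd( @(x0) abs(TsWuSph(x0) - c), a, b)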
  2 Comments
Kevin Hanekom on 10 Feb 2023
Thank you for the input; using this I received the following output.
Elapsed time is 0.202791 seconds.

 Iteration   Func-count     min f(x)         Procedure
     0            1         52.2326
Elapsed time is 0.169855 seconds.
     1            2         52.2326          initial simplex

Optimization terminated:
 the current x satisfies the termination criteria using OPTIONS.TolX of 1.000000e-04
 and F(X) satisfies the convergence criteria using OPTIONS.TolFun of 1.000000e-04
It seems the code is not attempting to minimize f(x), which, unless I am mistaken, should be driven as close to 0 as possible.
Here is the exact code I used.
%% Organizing all values into a "Guess" matrix
Guess = [F(1,2)];                        % single-parameter initial guess
options = optimset('Display','iter');   % show the iterative display
c = cfinal(3,3);                         % experimental target value
[x, fval] = fminsearch( @(x0) abs(TsWuSph(x0) - c), Guess, options)
