Steepest descent method algorithm for higher-dimensional objective functions
Hello,
I am trying to apply the steepest descent method to a function with 10 variables. With 2 variables it is easy, as I can split the problem. Now I have tried to write this algorithm for a vector input, but without success.
In the code below you will first see a simple steepest descent algorithm, and underneath it a similar algorithm that takes a vector as input, which is needed to tackle higher-dimensional problems using vector notation.
Can I have some feedback?
Clarisha
clc
close
clear
% Objective function
b = @(x,y) (1-x).^2 + (y-x.^2).^2;
% The gradient
dbdx = @(x,y) (2*x-2) - 4*x.*(y-x.^2);
dbdy = @(x,y) 2*(y-x.^2);
% Initial point
x0 = 20;
y0 = 20;
% Iterate: line search along the gradient line with fminsearch
for i = 1:10
    s1 = dbdx(x0,y0);              % search along the gradient line
    s2 = dbdy(x0,y0);              % (the line search returns a negative step d)
    xd = @(d) x0 + d*s1;           % point along the search direction
    yd = @(d) y0 + d*s2;
    bd = @(d) b(xd(d), yd(d));     % objective as a function of the step d
    d_star = fminsearch(bd, 0);    % optimal step length
    x1 = xd(d_star);
    y1 = yd(d_star);
    iteratie = i
    x0 = x1;                       % update the current point
    y0 = y1;
    ObjectiveValue = b(x0,y0)
end
%********************************************************************************************************
%Steepest descent method for functions with a vector input.
B=@(X) (1-X).^2;
DBDX=@(X) -2*(1-X);
X0=[0 0]; %initials
for iteration=1:N
S=DBDX(X0);
XK=@(D) X0+D.*S;
BK=@(D) B(XK(D));
D_STAR=fminsearch(BK,X0);
X=XK(D_STAR)
X0=X
end
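For reference, a minimal corrected sketch of this vector version is shown below. It assumes the intended objective is the sum of (1-X).^2 over all components (so that BK(D) returns a scalar), uses a scalar line-search variable D started at 0, and sets the iteration count N explicitly; these are assumptions, not a definitive fix.
% Corrected sketch of the vector version (assumed objective: sum((1-X).^2))
B    = @(X) sum((1-X).^2);      % scalar-valued objective of a vector X
DBDX = @(X) -2*(1-X);           % gradient vector
X0   = zeros(1,10);             % initial point, 10 variables
N    = 20;                      % number of iterations (must be defined)
for iteration = 1:N
    S      = -DBDX(X0);         % steepest descent direction: negative gradient
    XK     = @(D) X0 + D*S;     % point along the search direction
    BK     = @(D) B(XK(D));     % objective as a function of the scalar step D
    D_STAR = fminsearch(BK, 0); % one-dimensional line search starting at D = 0
    X0     = XK(D_STAR);        % update the current point
end
X0, B(X0)                       % final point and objective value
The key difference from the attempt above is that fminsearch is used only for the one-dimensional search over the scalar step D, while B collapses the vector argument into a single number.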
Accepted Answer
More Answers (1)
Walter Roberson
on 21 Oct 2024
fminsearch() uses the Nelder-Mead simplex algorithm, not steepest descent.
One implementation of Steepest Descent is https://www.mathworks.com/matlabcentral/answers/787539-steepest-descent-algorithm-in-matlab#answer_1191330
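As an illustration of what a steepest descent loop without fminsearch can look like, here is a minimal sketch with a backtracking (Armijo) line search. The 10-variable objective and its gradient are only placeholders (an extended Rosenbrock-type sum chosen to mirror the 2-variable example above), not code from the linked answer.
% Sketch: steepest descent with a backtracking (Armijo) line search
n = 10;
f = @(x) sum((1 - x(1:end-1)).^2 + (x(2:end) - x(1:end-1).^2).^2);

x = zeros(n,1);                          % starting point
for k = 1:500
    % Analytic gradient of the placeholder objective
    g = zeros(n,1);
    g(1:end-1) = -2*(1 - x(1:end-1)) - 4*x(1:end-1).*(x(2:end) - x(1:end-1).^2);
    g(2:end)   = g(2:end) + 2*(x(2:end) - x(1:end-1).^2);

    if norm(g) < 1e-6, break; end        % stop when the gradient is (nearly) zero

    d = -g;                              % steepest descent direction
    t = 1;                               % backtracking line search
    while f(x + t*d) > f(x) - 1e-4*t*(g.'*g)
        t = t/2;                         % halve the step until sufficient decrease
    end
    x = x + t*d;                         % take the step
end
fval = f(x)                              % display the final objective value
The backtracking loop halves the trial step until the Armijo sufficient-decrease condition holds, so no call to fminsearch is needed for the line search.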