The conjugate gradient method for unconstrained optimization - how to restart

Hello,
I am trying to implement the conjugate gradient method, but I do not know how to restart the process when n iterations are reached (n is the number of variables).
function [x_opt,f_opt,k,t,q] = conjugate_gradient(fob,g_fob,x0,tol_grad)
tic
n = size(x0);
c0 = feval(g_fob,x0);
k = 0;
t = 0;
q = 0;
if norm(c0) < tol_grad
    x_opt = x0;
    f_opt = feval(fob,x_opt);
else
    d = -c0; % search direction
    alfa0 = equal_interval_line_search(x0,d,fob,0.5,1e-6); % step size (line search)
    x1 = x0 + alfa0*d;
    c1 = feval(g_fob,x1);
    while norm(c1) > tol_grad
        if k < n(2)
            beta = (norm(c1)/norm(c0))^2;
            d = -c1 + beta*d;
            alfa1 = equal_interval_line_search(x1,d,fob,0.5,1e-6);
            x2 = x1 + alfa1*d;
            c0 = c1;
            c1 = feval(g_fob,x2); % gradient at the current point
            x1 = x2;
        else
            x0 = x1;
            c0 = c1;
            d = -c1;
            alfa0 = equal_interval_line_search(x0,d,fob,0.5,1e-6);
            x1 = x0 + alfa0*d;
            c1 = feval(g_fob,x1);
            while norm(c1) > tol_grad
                beta = (norm(c1)/norm(c0))^2;
                d = -c1 + beta*d;
                alfa1 = equal_interval_line_search(x1,d,fob,0.5,1e-6);
                x2 = x1 + alfa1*d;
                c0 = c1;
                c1 = feval(g_fob,x2);
                x1 = x2;
                q = q + 1;
            end
        end
        k = k + 1;
    end
    t = q + k;
    x_opt = x1;
    f_opt = feval(fob,x_opt);
end
toc

Answers (1)

Anay on 26 Aug 2025
Hi Matheus,
I understand that you are trying to implement the conjugate gradient method with a restart strategy that triggers when the number of iterations reaches the dimension of the problem. There are a few issues with your current implementation.
  • The nested while loop inside the else branch can run forever: once k >= n(2), execution enters the else branch and starts an inner while loop with the same condition as the outer loop (norm(c1) > tol_grad). That inner loop never restarts the search direction again, so if it fails to converge, the function never exits.
  • The restart logic can be simplified.
The standard conjugate gradient method theoretically converges in at most n iterations for a quadratic function. For non-quadratic functions, restarting every n iterations helps maintain the effectiveness of the method by resetting the search direction to the steepest-descent direction. You can make the following changes:
  • Fix the potential infinite loop.
  • Instead of nested loops, use a modulo check, mod(k, n) == 0, to decide when to restart. This keeps a single loop and triggers a restart every n iterations.
Please refer to the below example code:
% Note: n must be a scalar here (e.g. n = numel(x0));
% in your code n = size(x0) is a 1x2 vector, which breaks mod(k, n) == 0.
while norm(c1) > tol_grad
    % Restart whenever the iteration count is a multiple of the
    % problem dimension: "k" counts iterations, "q" counts restarts.
    if mod(k, n) == 0
        d = -c1;           % reset direction to the negative gradient
        if k > 0
            q = q + 1;     % count restarts
        end
    else
        beta = (norm(c1)/norm(c0))^2;  % Fletcher-Reeves coefficient
        d = -c1 + beta*d;
    end
    % Line search
    alfa = equal_interval_line_search(x1, d, fob, 0.5, 1e-6);
    % Update iterate and gradients
    x2 = x1 + alfa*d;
    c0 = c1;
    c1 = feval(g_fob, x2);
    x1 = x2;
    k = k + 1;
end
t = k + q;  % "t" is the total count, as in your original function
If you need to modify the restart frequency or implement other restart strategies, you can adjust the restart condition accordingly.
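For reference, here is a self-contained sketch of the same restarted loop on a small quadratic. Since your equal_interval_line_search is not shown, this example substitutes the exact line-search step for a quadratic objective, alfa = -(c1'*d)/(d'*A*d); everything else follows the loop above.

```matlab
% Minimal sketch: restarted Fletcher-Reeves CG on f(x) = 0.5*x'*A*x - b'*x.
% Assumption: exact line search stands in for equal_interval_line_search.
A = [4 1; 1 3];            % symmetric positive definite matrix
b = [1; 2];
g_fob = @(x) A*x - b;      % gradient of the quadratic
n  = numel(b);             % problem dimension (a scalar)
x1 = zeros(n,1);
c1 = g_fob(x1);
k = 0; q = 0; d = -c1;
while norm(c1) > 1e-8
    if mod(k, n) == 0
        d = -c1;                       % restart: steepest descent
        if k > 0, q = q + 1; end       % count restarts
    else
        beta = (norm(c1)/norm(c0))^2;  % Fletcher-Reeves coefficient
        d = -c1 + beta*d;
    end
    alfa = -(c1'*d)/(d'*A*d);          % exact step for a quadratic
    x1 = x1 + alfa*d;
    c0 = c1;
    c1 = g_fob(x1);
    k  = k + 1;
end
% x1 now approximates the minimizer A\b
```

On this quadratic the method converges within n iterations, so the restart never fires; on a non-quadratic objective with your line search, the mod(k, n) branch is what keeps the directions effective.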
I hope this helps!
