The optimization algorithms

Optimization algorithms (Newton, quasi-Newton, and gradient descent methods) are implemented for one- and two-dimensional objective functions.
Updated 18 Apr 2023


The optimization algorithms (Newton, quasi-Newton, and gradient descent methods) are implemented for objective functions in one and two dimensions. The Newton and quasi-Newton methods can run into difficulties when the Hessian is too complex to compute or does not exist, and the matrix inversion they require at each iteration can be prohibitive for optimization problems involving many variables, making these methods impractical. An alternative is the family of gradient descent algorithms, which require neither the explicit computation of the Hessian nor an approximation of it. A gradient descent algorithm is implemented by choosing successive descent directions and the amplitude of the descent step along each chosen direction. This family of algorithms is widely used for optimizing problems of varying complexity. The term "descent" arises because these algorithms search for extrema by moving in the direction opposite to the objective function's gradient.
Explanatory algorithmic schemes are available in the user guide.
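The two approaches described above can be sketched as follows. This is an illustrative example, not the package's actual MATLAB code: it uses a hypothetical 2-D quadratic objective f(x, y) = (x - 1)^2 + 4(y + 2)^2, whose minimum is at (1, -2), to contrast a Newton step (which needs the Hessian) with plain gradient descent (which only needs the gradient).

```python
import numpy as np

def grad(p):
    # Gradient of f(x, y) = (x - 1)^2 + 4*(y + 2)^2
    x, y = p
    return np.array([2.0 * (x - 1.0), 8.0 * (y + 2.0)])

# Constant Hessian of this quadratic objective.
H = np.array([[2.0, 0.0],
              [0.0, 8.0]])

def newton_step(p):
    # Solve H d = grad(p) rather than inverting H explicitly;
    # for a quadratic, a single Newton step lands on the minimum.
    return p - np.linalg.solve(H, grad(p))

def gradient_descent(p, step=0.1, tol=1e-8, max_iter=10_000):
    # Repeatedly move opposite the gradient with a fixed step size
    # until the gradient norm falls below the tolerance.
    for _ in range(max_iter):
        g = grad(p)
        if np.linalg.norm(g) < tol:
            break
        p = p - step * g
    return p

p0 = np.array([5.0, 5.0])
print(newton_step(p0))       # reaches the minimum (1, -2) in one step
print(gradient_descent(p0))  # converges iteratively to the same point
```

For a general (non-quadratic) objective, the Hessian changes at every iterate, which is exactly where the cost of forming and solving with it becomes the bottleneck that motivates the gradient descent family.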

Cite As

Kenouche Samir (2024). The optimization algorithms (https://www.mathworks.com/matlabcentral/fileexchange/128008-the-optimization-algorithms), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2023a
Compatible with any release
Platform Compatibility
Windows macOS Linux

Version    Published    Release Notes
04.2023.01