Newtonian Method (Optimizing Two Variable Functions)

This algorithm implements Newton's method for optimizing two-variable functions.
Updated 13 Mar 2017


Newton's method uses information from the Hessian and the gradient, i.e., curvature and slope, to compute optimum points. For quadratic functions it typically reaches the optimum in a single Newton step (one or two iterations), which is faster than the conjugate gradient method; this can be verified by comparing the results with the conjugate gradient algorithm I posted previously. However, for higher-order or non-quadratic functions the method may diverge, or it may converge to a stationary point that is not a minimum. To help ensure convergence to a minimum, preconditioners are often used; they limit the step size, which increases the number of computations but steers the iteration toward a minimizing solution.
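The submitted .m file is not reproduced on this page. As a rough illustration of the iteration described above (the update p <- p - H\grad f), the sketch below applies a damped Newton step to an example two-variable function. The objective, starting point, tolerance, and backtracking step-size control are assumptions chosen for the example, not taken from the submission, and the symbolic gradient/hessian calls require the Symbolic Math Toolbox.

```matlab
% Minimal sketch of (damped) Newton's method for a two-variable function.
% The objective, start point, and backtracking rule below are illustrative
% assumptions; they are not taken from the submitted file.
syms x y
f = 3*x^2 + 2*x*y + y^2 - 4*x + 5*y;        % example quadratic objective

g  = gradient(f, [x, y]);                   % symbolic gradient (2x1)
H  = hessian(f, [x, y]);                    % symbolic Hessian  (2x2)
fh = matlabFunction(f, 'Vars', {[x, y]});   % numeric handles for evaluation
gh = matlabFunction(g, 'Vars', {[x, y]});
Hh = matlabFunction(H, 'Vars', {[x, y]});

p = [0, 0];                                 % starting point (assumed)
for k = 1:50
    gk = gh(p);                             % gradient at the current point
    if norm(gk) < 1e-8, break, end          % stop when the gradient vanishes
    step = (Hh(p) \ gk).';                  % Newton step: solve H*step = g
    t = 1;                                  % damping factor (1 = full step)
    while fh(p - t*step) > fh(p) && t > 1e-4
        t = t/2;                            % shrink the step if it overshoots
    end
    p = p - t*step;
end
fprintf('Stationary point: x = %.4f, y = %.4f\n', p(1), p(2));
```

With t fixed at 1 this reduces to the pure Newton iteration, which lands on the minimizer of the quadratic above in a single step; the backtracking loop stands in for the step-size limiting that the description attributes to preconditioners.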

Cite As

Soumitra Sitole (2026). Newtonian Method (Optimizing Two Variable Functions) (https://it.mathworks.com/matlabcentral/fileexchange/62012-newtonian-method-optimizing-two-variable-functions), MATLAB Central File Exchange. Retrieved .

MATLAB Release Compatibility
Created with R2016b
Compatible with any release
Platform Compatibility
Windows macOS Linux
Version  Published  Release Notes
1.0.0.0

Update includes the m file.