
On the steplength selection in gradient methods for unconstrained optimization / Di Serafino, Daniela; Ruggiero, Valeria; Toraldo, Gerardo; Zanni, Luca. - In: APPLIED MATHEMATICS AND COMPUTATION. - ISSN 0096-3003. - 318:(2018), pp. 176-195. [10.1016/j.amc.2017.07.037]

On the steplength selection in gradient methods for unconstrained optimization

ZANNI, Luca
2018

Abstract

The seminal paper by Barzilai and Borwein (1988) has given rise to an extensive investigation, leading to the development of effective gradient methods. Several steplength rules have been first designed for unconstrained quadratic problems and then extended to general nonlinear optimization problems. These rules share the common idea of attempting to capture, in an inexpensive way, some second-order information. However, the convergence theory of the gradient methods using the previous rules does not explain their effectiveness, and a full understanding of their practical behaviour is still missing. In this work we investigate the relationships between the steplengths of a variety of gradient methods and the spectrum of the Hessian of the objective function, providing insight into the computational effectiveness of the methods, for both quadratic and general unconstrained optimization problems. Our study also identifies basic principles for designing effective gradient methods.
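As an illustration of the kind of steplength rule the abstract refers to, the following is a minimal sketch (not taken from the paper) of a gradient method with the classical BB1 steplength of Barzilai and Borwein, applied to a strictly convex quadratic; the function and problem data here are hypothetical examples, and the rule alpha = s'ated s / s'y is the standard BB1 formula, which acts as an inverse Rayleigh quotient of the Hessian.

```python
import numpy as np

def bb_gradient_descent(A, b, x0, max_iter=500, tol=1e-8):
    """Gradient method with the Barzilai-Borwein (BB1) steplength for the
    quadratic f(x) = 0.5 * x^T A x - b^T x, with A symmetric positive definite.
    Illustrative sketch only; real implementations add safeguards."""
    x = x0.astype(float)
    g = A @ x - b                      # gradient of the quadratic
    if np.linalg.norm(g) < tol:        # already at the minimizer
        return x
    alpha = 1.0 / np.linalg.norm(g)    # simple choice for the first step
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        # BB1 steplength: alpha = (s^T s) / (s^T y); for a quadratic,
        # 1/alpha is a Rayleigh quotient of A, so alpha approximates the
        # reciprocal of a Hessian eigenvalue (the "second-order information"
        # the abstract mentions).
        sy = s @ y
        alpha = (s @ s) / sy if sy > 0 else 1.0  # fallback safeguard
        x, g = x_new, g_new
    return x

# Small SPD test problem with a moderately ill-conditioned Hessian.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
x = bb_gradient_descent(A, b, np.zeros(3))
```

Because the diagonal Hessian makes the eigenvalues explicit, one can observe the phenomenon studied in the paper: the BB steplengths sweep through approximations of the reciprocals of the Hessian eigenvalues as the iteration proceeds.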
Year: 2018
Date of publication: 1 Feb 2018
Volume: 318
Pages: 176-195
Di Serafino, Daniela; Ruggiero, Valeria; Toraldo, Gerardo; Zanni, Luca
Files in this record:

Steplength_selection_AMC_2017.pdf
  Access: Open access
  Description: Main article
  Type: Author's original version submitted for publication
  Size: 4.15 MB
  Format: Adobe PDF

VOR_On the steplength selection in gradient methods.pdf
  Access: Restricted access
  Type: Publisher's published version
  Size: 2.35 MB
  Format: Adobe PDF

Creative Commons License
Metadata in IRIS UNIMORE are released under the Creative Commons CC0 1.0 Universal license, while publication files are released under the Attribution 4.0 International (CC BY 4.0) license, unless otherwise stated.
In case of copyright infringement, contact Iris Support.

Use this identifier to cite or link to this record: https://hdl.handle.net/11380/1146848
Citations
  • PMC: not available
  • Scopus: 76
  • Web of Science (ISI): 69