
Publication detail

2020, ATTI DELLA ACCADEMIA PELORITANA DEI PERICOLANTI, CLASSE DI SCIENZE FISICHE, MATEMATICHE E NATURALI, Pages 1-16 (volume: 98/S2)

Convergence rate for diminishing stepsize methods in nonconvex constrained optimization via ghost penalties (01a Journal article)

Facchinei Francisco, Kungurtsev Vyacheslav, Lampariello Lorenzo, Scutari Gesualdo

This is a companion paper to "Ghost penalties in nonconvex constrained optimization: Diminishing stepsizes and iteration complexity" (to appear in Mathematics of Operations Research). We consider the ghost penalty scheme for nonconvex, constrained optimization introduced in that paper, coupled with a diminishing stepsize procedure. Under an extended Mangasarian-Fromovitz-type constraint qualification, we give an expression for the maximum number of iterations needed to achieve a given solution accuracy according to a natural stationarity measure, thus establishing the first result of this kind for a diminishing stepsize method for nonconvex, constrained optimization problems.
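For context, a diminishing stepsize procedure in this setting generally refers to a stepsize sequence that vanishes while remaining non-summable. The LaTeX sketch below states these standard conditions as a generic illustration only; the symbols $\gamma_k$, $\gamma_0$ and $\alpha$ are assumed notation and the display is not the specific rule analyzed in the paper.

% Generic diminishing-stepsize conditions (illustration only, not the paper's exact rule)
\[
\gamma_k > 0, \qquad \lim_{k \to \infty} \gamma_k = 0, \qquad \sum_{k=0}^{\infty} \gamma_k = +\infty,
\]
% a common concrete choice being $\gamma_k = \gamma_0 / (k+1)^{\alpha}$ with $\alpha \in (0, 1]$.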
Research group: Continuous Optimization