Document detail
ID

oai:arXiv.org:2410.08033

Topic
Mathematics - Optimization and Control; Electrical Engineering and Systems Science
Author
Agarwal, Aayushya; Pileggi, Larry; Rohrer, Ronald
Category

Computer Science

Year

2024

Listing date

10/16/2024

Keywords
fast, variables, methods, systems, convergence, optimization

Abstract

Second-order optimization methods exhibit fast convergence to critical points; however, in nonconvex optimization, these methods often require restrictive step-sizes to ensure a monotonically decreasing objective function.

In the presence of highly nonlinear objective functions with large Lipschitz constants, increasingly small step-sizes become a bottleneck to fast convergence.

We propose a second-order optimization method that utilizes a dynamic system model to represent the trajectory of optimization variables as an ODE.
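
For illustration only: the abstract does not state the exact dynamic-system model, but a common way to cast second-order optimization as an ODE is the continuous Newton flow, assumed here as an example:

\[
\frac{d\mathbf{x}(t)}{dt} = -\bigl[\nabla^2 f(\mathbf{x}(t))\bigr]^{-1}\,\nabla f(\mathbf{x}(t)), \qquad \mathbf{x}(0) = \mathbf{x}_0 .
\]

Its equilibria are exactly the critical points where \(\nabla f(\mathbf{x}) = 0\), so following the trajectory to steady state yields a critical point of \(f\).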

We then follow the quasi-steady state trajectory by forcing variables with the fastest rise time into a state known as quiescence.

This optimization via quiescence allows us to adaptively select large step-sizes that sequentially follow each optimization variable to a quasi-steady state until all state variables reach the actual steady state, coinciding with the optimum.
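
As a rough illustration of the step-size idea (this is not the authors' algorithm; the function names, the step cap `delta`, and the quiescence test below are assumptions made for this sketch), one can integrate a Newton-type flow with an explicit Euler step whose size is set by the fastest-moving variable:

```python
# Illustrative sketch only (not the method from the paper): explicit-Euler
# integration of a Newton-type flow dx/dt = -H(x)^{-1} grad f(x), with the
# step size set by the fastest-moving ("fastest rise time") variable.
# `delta`, the quiescence tolerance `tol`, and all names are assumptions.
import numpy as np

def newton_flow_sketch(grad, hess, x0, delta=1.0, tol=1e-8, max_iter=500):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g, np.inf) < tol:   # all variables quiescent: steady state reached
            break
        v = -np.linalg.solve(hess(x), g)      # flow direction dx/dt at the current point
        # Assumed step rule: limit the fastest variable's move to `delta`
        # per step, letting slower variables follow it toward quasi-steady
        # state; no monotone decrease of the objective is enforced.
        h = min(1.0, delta / np.max(np.abs(v)))
        x = x + h * v                         # explicit Euler step along the flow
    return x

# Example on the (nonconvex) Rosenbrock function f(x, y) = (1-x)^2 + 100(y-x^2)^2:
grad = lambda z: np.array([-2*(1 - z[0]) - 400*z[0]*(z[1] - z[0]**2),
                           200*(z[1] - z[0]**2)])
hess = lambda z: np.array([[2 - 400*z[1] + 1200*z[0]**2, -400*z[0]],
                           [-400*z[0], 200.0]])
print(newton_flow_sketch(grad, hess, [-1.2, 1.0]))  # expected to approach the minimizer (1, 1)
```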

The result is a second-order method that utilizes large step-sizes and does not require a monotonically decreasing objective function to reach a critical point.

Experimentally, we demonstrate the fast convergence of this approach on nonconvex problems in power systems and compare it to existing state-of-the-art second-order methods, including damped Newton-Raphson, BFGS, and SR1.

Agarwal, Aayushya; Pileggi, Larry; Rohrer, Ronald, 2024, Second-Order Optimization via Quiescence

