Accelerating Sequential Quadratic Approximation for Regularized Optimization in Convergence Speed and in Running Time



    Department of Systems Engineering and Engineering Management

                       The Chinese University of Hong Kong


Date: Friday, February 10, 2023, 4:30 pm – 6:00 pm

Venue: ERB 513, The Chinese University of Hong Kong

Title: Accelerating Sequential Quadratic Approximation for Regularized Optimization in Convergence Speed and in Running Time

Speaker: Prof. LEE Ching-pei, Academia Sinica



For regularized optimization, which minimizes the sum of a differentiable term and a regularizer that promotes structured solutions, inexact proximal-Newton-type methods, and more generally successive quadratic approximation (SQA) methods, are widely used for their fast convergence and superior empirical performance. These methods update the iterates by iteratively (and thus inexactly) solving a subproblem built from a quadratic approximation of the smooth term.
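As a rough illustration (not the speaker's algorithm), an SQA outer loop with an inexact proximal-gradient inner solver can be sketched for the L1-regularized least-squares problem. The function names, the unit outer step size, and the use of the exact Hessian are all simplifying assumptions of this sketch:

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau*||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def sqa_l1_least_squares(A, b, lam, outer_iters=50, inner_iters=5):
    """Minimal SQA sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Each outer iteration builds the quadratic model of the smooth term
    around the current iterate and solves the regularized subproblem
    *inexactly* with a few proximal-gradient steps."""
    n = A.shape[1]
    x = np.zeros(n)
    H = A.T @ A                           # Hessian of the smooth term
    L = np.linalg.norm(H, 2)              # Lipschitz constant of the model gradient
    for _ in range(outer_iters):
        g = A.T @ (A @ x - b)             # gradient of the smooth term at x
        y = x.copy()                      # subproblem variable
        for _ in range(inner_iters):      # inexact subproblem solve
            grad_model = g + H @ (y - x)  # gradient of the quadratic model at y
            y = soft_threshold(y - grad_model / L, lam / L)
        x = y                             # accept the inexact subproblem solution
    return x
```

In a full SQA method the subproblem is solved only to an adaptive inexactness tolerance and the update is safeguarded by a line search or trust region; both are omitted here for brevity.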

This talk discusses our recent improvements to SQA in both theory and practice, and it consists of two parts.

In the first part, we discuss its superlinear behavior on degenerate problems. We show that local superlinear, and even quadratic, convergence of the distance between the iterates and the solution set is attainable without requiring the Hessian of the smooth term to be positive definite or Lipschitz continuous, provided that a Hölderian error bound (HEB) condition holds and the Hessian is p-Hölder continuous for a certain range of p; this in turn implies strong convergence of the iterate sequence. We then propose a novel globalization strategy that maintains this fast local convergence and further guarantees a strict decrease of the objective value using a growth condition derived from the HEB. A particularly novel feature of our algorithm is that it also achieves superlinear convergence of the objective value.
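In one common formulation (the notation here is an assumption of this summary, not taken from the talk), with F the regularized objective, F* its optimal value, and X* the solution set, an HEB condition on an initial level set takes the form

```latex
\operatorname{dist}\bigl(x, \mathcal{X}^*\bigr)
  \le c \,\bigl(F(x) - F^*\bigr)^{\gamma}
  \qquad \text{for all } x \text{ with } F(x) \le F(x^0),
```

for some c > 0 and exponent γ ∈ (0, 1]; the case γ = 1/2 corresponds to quadratic growth, and the attainable local rate then depends on the interplay between γ and the Hölder exponent p of the Hessian.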

In the second part of this talk, we consider algorithmic acceleration of SQA through manifold identification when the regularization term is partly smooth at a solution. More specifically, we first show that although general inexact subproblem solutions cannot identify the active manifold along which a partly smooth regularizer becomes smooth, approximate solutions generated by commonly used subproblem solvers do identify this manifold, even at arbitrarily low solution precision. We then exploit this property to propose an improved SQA method that switches to efficient smooth optimization methods once the active manifold is identified. We show that for a wide class of problems, the proposed method converges superlinearly not only in the number of iterations but also in running time. Finally, we demonstrate through numerical experiments on large-scale logistic regression that the proposed algorithm outperforms the state of the art.
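The switching idea can be illustrated for the L1 regularizer, whose active manifold at a solution is the set of points sharing the solution's support and sign pattern; on that manifold the regularizer is linear, hence smooth. The helper names below are hypothetical, and the reduced solve assumes the restricted normal equations are invertible:

```python
import numpy as np

def identify_support(x, tol=0.0):
    """Active-manifold estimate for the L1 regularizer: the nonzero
    coordinates of x (the support on which ||.||_1 is smooth)."""
    return np.flatnonzero(np.abs(x) > tol)

def refine_on_manifold(A, b, lam, x):
    """Hypothetical acceleration step for min 0.5*||Ax - b||^2 + lam*||x||_1:
    once the support S and sign pattern s are identified, the problem
    restricted to S is smooth, min_u 0.5*||A_S u - b||^2 + lam * s^T u,
    so it can be solved directly via its normal equations."""
    S = identify_support(x)
    if S.size == 0:
        return x
    s = np.sign(x[S])                 # fixed sign pattern on the manifold
    A_S = A[:, S]
    # Stationarity of the reduced smooth problem:
    #   A_S^T (A_S u - b) + lam * s = 0
    u = np.linalg.solve(A_S.T @ A_S, A_S.T @ b - lam * s)
    x_new = np.zeros_like(x)
    x_new[S] = u
    return x_new
```

Here a single linear solve stands in for the "efficient smooth optimization method"; in practice one would run a smooth (quasi-)Newton method on the identified manifold and verify that the sign pattern is preserved.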



LEE Ching-pei is an assistant research fellow at Academia Sinica. Prior to that, Ching-pei was the Peng Tsu Ann Assistant Professor in the Department of Mathematics at the National University of Singapore from 2019 to 2020. Before joining NUS, Ching-pei received PhD and M.S. degrees in Computer Sciences from the University of Wisconsin-Madison, and a B.B.A. degree in Finance from National Taiwan University.

Ching-pei's research focuses on efficient algorithms for large-scale problems in nonlinear optimization and their applications in machine learning with big data and big models, with a particular interest in developing and providing theoretical guarantees for methods that are empirically superior on real-world problems.

Dr. Lee is the recipient of the Emerging Young Scholars award of the Ministry of Science and Technology of Taiwan, the Grand Challenge Seed Grant Fellowship of Academia Sinica, and the Young Scholar award of the Foundation for the Advancement of Outstanding Scholarship. 



Everyone is welcome to attend the talk!

SEEM-5202 Website:


Friday, February 10, 2023 - 16:30 to 18:00