Schedule for: 25w5372 - Strategies for Handling Applications with Nonconvexity

Beginning on Sunday, May 11 and ending on Friday, May 16, 2025

All times in Banff, Alberta time, MDT (UTC-6).

Sunday, May 11
16:00 - 17:30 Check-in begins at 16:00 on Sunday and is open 24 hours (Front Desk - Professional Development Centre)
17:30 - 19:30 Dinner
A buffet dinner is served daily between 5:30pm and 7:30pm in Vistas Dining Room, top floor of the Sally Borden Building.
(Vistas Dining Room)
20:00 - 22:00 Informal gathering (TCPL Foyer)
Monday, May 12
07:00 - 08:45 Breakfast
Breakfast is served daily between 7 and 9am in the Vistas Dining Room, the top floor of the Sally Borden Building.
(Vistas Dining Room)
08:45 - 09:00 Introduction and Welcome by BIRS Staff
A brief introduction to BIRS with important logistical information, technology instruction, and opportunity for participants to ask questions.
(TCPL 201)
09:00 - 09:30 Veit Elser: Life after the Book*
I will share the solution to the last exercise, the final application, and of course the cliffhanger ending.
* Solving Problems with Projections: From phase retrieval to packing
(TCPL 201)
09:30 - 10:00 Courtney Paquette (TCPL 201)
10:00 - 10:30 Coffee Break (TCPL Foyer)
10:30 - 11:00 Ahmet Alacaoglu: Handling Generalized Monotone Operators with First-Order Algorithms
This talk will focus on first-order algorithms for finding zeros of monotone or generalized monotone operators, where we use generalized monotonicity in view of the work of Bauschke, Moursi and Wang [Math. Program. 189 (2021), pp. 55-74]. We will consider the complexity analysis of inexact fixed-point algorithms. We will discuss the insights this analysis gives for improving the level of nonmonotonicity that can be tolerated by first-order algorithms, as well as for improving the oracle complexity for stochastic problems. We also discuss the consequences of these results for min-max problems.
This talk is based on joint work with Donghwan Kim and Stephen Wright.
(TCPL 201)
11:00 - 11:30 Ziyuan Wang: Bregman level proximal subdifferential and new insights into Bregman proximal operators
Classic subdifferentials in variational analysis may fail to fully represent the Bregman proximal operator in the absence of convexity.
In this talk, we introduce the left and right Bregman level proximal subdifferentials and investigate them systematically. Every Bregman proximal operator turns out to be the resolvent of a Bregman level proximal subdifferential under a standard range assumption, even without convexity. Aided by this pleasant feature, we establish new correspondences among useful properties of the Bregman proximal operator, the underlying function, and the (left) Bregman level proximal subdifferential, generalizing classical equivalences in the Euclidean case. Unlike the classical setting, asymmetry and a duality gap emerge as natural consequences of the Bregman distance. We also characterize the existence and single-valuedness of the Bregman level proximal subdifferential, investigate coincidence results, and make an interesting connection to relative smoothness.
(TCPL 201)
11:30 - 13:00 Lunch
Lunch is served daily between 11:30am and 1:30pm in the Vistas Dining Room, the top floor of the Sally Borden Building.
(Vistas Dining Room)
13:00 - 13:30 Stephen Vavasis (TCPL 201)
13:30 - 14:00 Shambhavi Singh (TCPL 201)
14:00 - 14:30 Sedi Bartz: Convex analysis in multi-marginal settings
We consider aspects of the convex analytic structure that lies at the foundation of multi-marginal optimal transport. We present extensions of classical convex analysis and monotone operator theory, recent progress, open questions, and ongoing work.
(TCPL 201)
14:30 - 15:00 Jelena Diakonikolas (TCPL 201)
15:00 - 15:30 Coffee Break (TCPL Foyer)
15:30 - 16:00 Stephen Simons: Faces of quasidensity
This talk is about the maximally monotone and quasidense subsets of the product of a real Banach space and its dual. We discuss eight subclasses of the maximally monotone sets that are equivalent to the quasidense ones. We define the Gossez extension to the dual of a maximally monotone set, and give seven equivalent characterizations of an element of this set in the quasidense case. We discuss maximally monotone sets of "type (NI)" (one of the eight classes referred to above) and we show that the "tail operator" is not of type (NI), but it is the Gossez extension of a maximally monotone set that is of type (NI). We generalize Rockafellar's surjectivity theorem for maximally monotone subsets of reflexive Banach spaces to maximally monotone subsets of type (NI) of general Banach spaces. We discuss a generalization of the Brezis-Browder theorem on monotone linear subspaces of reflexive spaces to the nonreflexive situation.
(TCPL 201)
16:00 - 16:30 David Torregrosa Belén: Enhanced randomized block proximal gradient methods beyond global Lipschitz continuity
Block-coordinate algorithms are recognized to furnish efficient iterative schemes for addressing large-scale problems, especially when obtaining full derivatives entails substantial memory requirements and computational effort. In this talk, we analyze a randomized block proximal gradient algorithm for addressing the sum of a differentiable function and a separable, proper, lower-semicontinuous function, both possibly nonconvex. In contrast to previous works, we require the partial gradients of the differentiable function to be Lipschitz continuous only locally. At each iteration, the proposed scheme selects a proximal stepsize adaptively using a linesearch that can be performed in a nonmonotone fashion. An additional linesearch is then conducted to obtain an improved descent condition. The combination of both linesearches leads to enhanced performance, as illustrated in an experiment in nonnegative matrix factorization for image compression.
(TCPL 201)
16:30 - 17:00 Yuan Gao (TCPL 201)
17:00 - 17:30 Viktor Pavlovic (TCPL 201)
17:30 - 19:30 Dinner
A buffet dinner is served daily between 5:30pm and 7:30pm in Vistas Dining Room, top floor of the Sally Borden Building.
(Vistas Dining Room)
Tuesday, May 13
07:00 - 08:45 Breakfast
Breakfast is served daily between 7 and 9am in the Vistas Dining Room, the top floor of the Sally Borden Building.
(Vistas Dining Room)
09:00 - 09:30 Yura Malitsky: Entropic Mirror Descent for Linear Systems: Polyak’s Stepsize and Implicit Bias
In this talk, we explore the use of entropic mirror descent for solving linear systems. We discuss the motivation behind this approach, such as its implicit bias, and analyze its convergence properties. Although the method has a simple structure, its convergence behavior is quite subtle. We demonstrate how a Polyak-type stepsize can help overcome these challenges and lead to an explicit rate. If time permits, we will also present a generalization of entropic mirror descent.
(TCPL 201)
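As a rough illustration of the iteration in the abstract above, here is a minimal sketch of entropic mirror descent with a Polyak-type stepsize on a synthetic simplex-constrained least-squares instance (the problem data, iteration count, and stepsize details are assumptions for illustration, not the speaker's setup):

```python
import numpy as np

# Synthetic consistent linear system A x = b with a solution on the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = rng.random(10)
x_true /= x_true.sum()
b = A @ x_true

x = np.full(10, 0.1)                 # uniform starting point on the simplex
best_x, best_f = x.copy(), np.inf
for _ in range(20000):
    r = A @ x - b                    # residual of the linear system
    f = 0.5 * r @ r                  # least-squares value; optimal value is 0
    if f < best_f:
        best_f, best_x = f, x.copy()
    if f < 1e-16:
        break
    g = A.T @ r                      # gradient of f
    # Polyak-type stepsize: (f(x) - f*) over the squared sup-norm of the
    # gradient, the dual norm for entropic mirror descent on the simplex.
    eta = f / np.linalg.norm(g, np.inf) ** 2
    w = x * np.exp(-eta * g)         # multiplicative (entropic) update
    x = w / w.sum()                  # normalize back onto the simplex

residual = np.linalg.norm(A @ best_x - b)
```

The multiplicative update keeps the iterates strictly positive, which is one source of the implicit bias mentioned in the abstract.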
09:30 - 10:00 Renata Sotirov: Lagrangian duality for Mixed-Integer Semidefinite Programming
Mixed-integer semidefinite programming can be viewed as a generalization of mixed-integer programming where the vector of variables is replaced by mixed-integer positive semidefinite matrix variables. The combination of positive semidefiniteness and integrality makes it possible to formulate various nonlinear optimization problems as mixed-integer semidefinite programs (MISDPs). In this talk we show that MISDPs induce bounds based on Lagrangian duality theory. By introducing MISDP-based projected bundle and subgradient algorithms, we show that the resulting Lagrangian dual bounds are stronger than the standard semidefinite programming bounds for various optimization problems.
(TCPL 201)
10:00 - 10:30 Coffee Break (TCPL Foyer)
10:30 - 11:00 Aris Daniilidis: Pathological differentiable locally Lipschitz functions
We focus on the difference between differentiable and strictly differentiable locally Lipschitz functions from the viewpoint of nonsmooth analysis: while in the latter class all limiting Jacobians are singletons, we show that there exists a differentiable locally Lipschitz function whose limiting Jacobian represents all nonempty compact connected subsets of matrices. In the particular case of real-valued functions, we obtain differentiable functions with surjective limiting and Clarke subdifferentials. In this case, our concrete example scheme will also reveal that the class of such pathological locally Lipschitz differentiable functions is dense (for the topology of uniform convergence) and spaceable (for the Lip-norm topology). The talk is based on the works:
A. Daniilidis, R. Deville, S. Tapia-Garcia: All convex bodies are in the subdifferential of some everywhere differentiable locally Lipschitz function, Proc. London Math. Society (3) 129 (2024), no. 5, Paper No. e70007, 27 pp., DOI http://dx.doi.org/10.1112/plms.70007
A. Daniilidis, S. Tapia-Garcia: Differentiable functions with surjective Clarke Jacobians, hal-04952836 (preprint, 15 pp., 2025)
(TCPL 201)
11:00 - 11:28 Robert Csetnek: Tikhonov regularization for monotone operators
We consider the asymptotic behaviour of the trajectories of a second-order dynamical system with Tikhonov regularization for solving a monotone equation with a single-valued, monotone and continuous operator acting on a real Hilbert space. We consider a vanishing damping which is correlated with the Tikhonov parameter, in line with recent developments in the literature on this topic. A correction term involving the time derivative of the operator along the trajectory also appears in the system and makes the link with Newton and Levenberg-Marquardt type methods. We obtain strong convergence of the trajectory to the minimal norm solution and fast convergence rates for the velocity and for a quantity involving the operator along the trajectory. The rates are very close to the known fast convergence results for systems without Tikhonov regularization; the novelty is that we also obtain strong convergence of the trajectory to the minimal norm solution. As an application, we introduce a primal-dual dynamical system for solving linearly constrained convex optimization problems, where strong convergence of the trajectories is highlighted together with fast convergence rates for the feasibility measure and function values.
The talk is based on joint works with Radu Bot (University of Vienna) and Szilard Laszlo (Technical University of Cluj-Napoca).
(TCPL 201)
11:29 - 11:30 Group Photo
Meet in foyer of TCPL to participate in the BIRS group photo. The photograph will be taken outdoors, so dress appropriately for the weather. Please don't be late, or you might not be in the official group photo!
(TCPL Foyer)
11:30 - 13:00 Lunch
Lunch is served daily between 11:30am and 1:30pm in the Vistas Dining Room, the top floor of the Sally Borden Building.
(Vistas Dining Room)
13:00 - 13:30 Adriana Nicolae: Approaches to subgradient algorithms in nonpositively-curved spaces
Beyond linear spaces and Riemannian manifolds, geodesic metric spaces of nonpositive curvature provide a suitable framework for developing convex optimization. While Riemannian subgradient algorithms rely on tangent space constructions and local linearization, understanding subgradients and the algorithms based on them is less straightforward in nonpositively-curved spaces. By using horoballs and Busemann functions, we explore subgradient-style methods (including a splitting version) framed within the underlying space, and derive complexity bounds that match those typically obtained. This talk is based on joint work with A. Goodwin (Cornell University), A. S. Lewis (Cornell University), and G. Lopez-Acedo (University of Seville).
(TCPL 201)
13:30 - 14:00 Francisco J. Aragón Artacho: Forward-backward algorithms devised by graphs
In this talk, we will present a methodology for devising forward-backward methods for finding zeros in the sum of a finite number of maximally monotone operators. We extend the techniques from [SIAM J. Optim., 34 (2024), pp. 1569-1594] to cover the case involving a finite number of cocoercive operators, which are evaluated directly by the algorithm instead of through their resolvents. The algorithms are induced by three graphs that determine how the algorithm variables interact with each other and how they are combined to compute each resolvent. The hypotheses on these graphs ensure that the algorithms obtained have minimal lifting and are frugal, meaning that the ambient space of the underlying fixed point operator has minimal dimension and that each resolvent and each cocoercive operator is evaluated only once per iteration. This framework not only allows us to recover some known methods, but also to generate new ones, such as the forward-backward algorithm induced by a complete graph. If time permits, we will present some numerical experiments showing how the choice of graphs influences the performance of the algorithms.
This is joint work with Rubén Campoy and César López-Pastor.
(TCPL 201)
14:00 - 14:30 Madeleine Udell: Online Scaled Gradient Methods
We introduce a framework to accelerate the convergence of gradient-based methods with online learning. The framework learns to scale the gradient at each iteration through an online learning algorithm and provably accelerates gradient-based methods asymptotically. In contrast with previous literature, where convergence is established based on worst-case analysis, our framework provides a strong convergence guarantee with respect to the optimal stepsize for the iteration trajectory. We obtain the first convergence guarantee for the widely used hypergradient descent method (HDM) and for new variants with heavy-ball momentum (HDM-HB) and Nesterov momentum. HDM-HB significantly outperforms other adaptive first-order methods, and often matches the performance of L-BFGS using less memory and cheaper iterations.
(TCPL 201)
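For a flavour of the hypergradient-descent idea behind HDM — the stepsize itself is adapted online from observed gradients — here is a minimal sketch on a toy quadratic (the problem data, the normalized update rule, and all constants are illustrative assumptions, not the method analyzed in the talk):

```python
import numpy as np

# Toy ill-conditioned quadratic: f(x) = 0.5 * x^T Q x, minimized at x = 0.
Q = np.diag([1.0, 10.0])
f = lambda x: 0.5 * x @ Q @ x

x = np.array([1.0, 1.0])
eta = 0.01                       # initial stepsize, adapted online below
beta = 1e-3                      # hypergradient learning rate (assumed value)
g_prev = np.zeros(2)

for _ in range(300):
    g = Q @ x
    # Hypergradient update: grow eta while consecutive gradients align,
    # shrink it when they point in opposing directions (normalized variant).
    denom = np.linalg.norm(g) * np.linalg.norm(g_prev)
    if denom > 0:
        eta += beta * (g @ g_prev) / denom
    x = x - eta * g              # plain gradient step with the learned eta
    g_prev = g

final_value = f(x)
```

The stepsize self-stabilizes near the stability boundary of the stiff coordinate, which is the intuition the convergence guarantees in the talk make precise.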
14:30 - 15:00 Xianfu Wang: Level proximal subdifferential, variational convexity, and beyond
The level proximal subdifferential was recently introduced by Rockafellar for studying proximal mappings of possibly nonconvex functions. We characterize variational convexity of a function by local firm nonexpansiveness of proximal mappings or by local relative monotonicity of the level proximal subdifferential, and use these to study local convergence of the proximal gradient method and related methods for variationally convex functions. Variational sufficiency guarantees that the proximal gradient method converges to local minimizers rather than just critical points. As a powerful tool, the level proximal subdifferential provides deep insights into variational analysis and optimization. Joint work with Luo, Wang and Yang.
(TCPL 201)
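As background for the proximal gradient method discussed above, here is a minimal sketch of the standard iteration on a toy lasso-type problem (the data, regularization weight, and iteration count are illustrative assumptions):

```python
import numpy as np

# Toy problem: minimize 0.5*||A x - b||^2 + lam*||x||_1.
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 15))
b = A @ (rng.standard_normal(15) * (rng.random(15) < 0.3))  # sparse ground truth
lam = 0.1

L = np.linalg.norm(A, 2) ** 2    # Lipschitz constant of the smooth gradient
t = 1.0 / L                      # stepsize

def prox_l1(v, thresh):
    # Proximal mapping of thresh*||.||_1: soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - thresh, 0.0)

x = np.zeros(15)
for _ in range(2000):
    grad = A.T @ (A @ x - b)             # gradient of the smooth part
    x = prox_l1(x - t * grad, t * lam)   # forward step, then prox step

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
```

Here the objective is convex, so the iterates reach a global minimizer; the talk concerns the subtler nonconvex case, where variational sufficiency is what rules out convergence to mere critical points.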
15:00 - 15:30 Coffee Break (TCPL Foyer)
15:30 - 16:00 Shoham Sabach (TCPL 201)
16:00 - 16:30 Enis Chenchene: A Fast Extra-Gradient Method with Flexible Anchoring
We introduce a novel Extra-Gradient method with anchoring governed by general parameters. Our method is derived from an explicit discretization of a dynamical system with Tikhonov regularization, which provides a theoretical foundation for analyzing its convergence properties. We establish strong convergence to specific points within the solution set, as well as convergence rates expressed in terms of the regularization parameters. Notably, our approach recovers the fast residual decay rate $O(k^{-1})$ for standard parameter choices. Numerical experiments highlight the competitiveness of the method and demonstrate how its flexible design enhances practical performance.
(TCPL 201)
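For a flavour of anchoring, here is a sketch of an anchored extra-gradient iteration in the style of the EAG method of Yoon and Ryu, with anchor coefficients 1/(k+2), applied to the rotation operator F(x, y) = (y, -x), whose unique zero is the origin (the operator, stepsize, and iteration count are illustrative assumptions, not the speaker's scheme):

```python
import numpy as np

# Monotone (skew-symmetric) operator F(x, y) = (y, -x); unique zero at (0, 0).
def F(z):
    return np.array([z[1], -z[0]])

z0 = np.array([1.0, 1.0])       # anchor point
z = z0.copy()
eta = 0.125                     # stepsize 1/(8L) with Lipschitz constant L = 1

for k in range(2000):
    beta = 1.0 / (k + 2)        # vanishing anchor coefficient
    anchor = z + beta * (z0 - z)
    z_half = anchor - eta * F(z)        # extrapolation step
    z = anchor - eta * F(z_half)        # update using the midpoint operator value

residual = np.linalg.norm(F(z))
```

Plain extra-gradient also converges here, but the anchor term is what yields the O(1/k) decay of the operator residual and strong convergence toward a specific solution, as discussed in the abstract.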
16:30 - 17:00 Manish Krishan Lal (TCPL 201)
17:00 - 17:30 Woosuk Jung (TCPL 201)
17:30 - 19:30 Dinner
A buffet dinner is served daily between 5:30pm and 7:30pm in Vistas Dining Room, top floor of the Sally Borden Building.
(Vistas Dining Room)
Wednesday, May 14
07:00 - 08:45 Breakfast
Breakfast is served daily between 7 and 9am in the Vistas Dining Room, the top floor of the Sally Borden Building.
(Vistas Dining Room)
09:00 - 09:30 Vera Roshchina: How to calculate generalised subdifferentials
Generalised subdifferentials provide local optimality conditions for nonsmooth nonconvex functions, and can be used for various other purposes, similar to the gradient and the convex subdifferential. Unfortunately, calculating generalised subdifferentials can often be a frustrating task, as they mostly don't have good calculus rules.
I will review some useful tricks that can help calculate common generalised subdifferentials of sufficiently structured functions.
(TCPL 201)
09:30 - 10:00 Scott Lindstrom: Gabay Duality for Nonconvex Optimisation
Daniel Gabay showed that when we use the alternating direction method of multipliers to solve a convex primal problem (p), we are implicitly using the Douglas--Rachford method to solve a convex dual problem (d). We can exploit this duality to prove many things about ADMM, and Gabay's framework is also the platform for various performance improvement schemes that take advantage of the discrete time dynamical system admitted by the Douglas--Rachford operator. In this talk, I will (1) describe how we can use Gabay's duality to study ADMM even for nonconvex problems; (2) show examples where we can use Gabay's duality to show that ADMM's iterates cycle or are chaotic for very natural problems; and (3) explain how we can use Gabay's duality in designing algorithms that show better promise.
(TCPL 201)
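For reference, the Douglas--Rachford iteration at the heart of Gabay's duality can be sketched on a toy two-set convex feasibility problem (the sets, starting point, and iteration count are illustrative assumptions):

```python
import numpy as np

# Find a point in the intersection of a line and a disk in R^2.
def proj_line(z):
    # Projection onto the line {(x, y) : y = x}.
    m = (z[0] + z[1]) / 2.0
    return np.array([m, m])

def proj_disk(z, center=np.array([2.0, 0.0]), radius=1.5):
    # Projection onto a closed disk (which does intersect the line above).
    d = z - center
    n = np.linalg.norm(d)
    return z if n <= radius else center + radius * d / n

def reflect(proj, z):
    # Reflection across a set: R(z) = 2 P(z) - z.
    return 2.0 * proj(z) - z

z = np.array([5.0, -3.0])       # arbitrary starting point
for _ in range(1000):
    # Douglas--Rachford governing iteration: z <- (z + R_B(R_A(z))) / 2.
    z = 0.5 * (z + reflect(proj_disk, reflect(proj_line, z)))

x = proj_line(z)                # shadow sequence: candidate solution
```

In the convex setting this governing sequence converges to a fixed point whose shadow lies in the intersection; the talk examines what the same dynamical system does when convexity is dropped.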
10:00 - 10:30 Coffee Break (TCPL Foyer)
10:30 - 11:00 Guoyin Li (TCPL 201)
11:00 - 11:30 Haihao Lu (TCPL 201)
11:30 - 13:00 Lunch
Lunch is served daily between 11:30am and 1:30pm in the Vistas Dining Room, the top floor of the Sally Borden Building.
(Vistas Dining Room)
13:00 - 14:00 EDI Network Opportunities (TCPL 201)
14:00 - 17:30 Free Afternoon (Banff National Park)
17:30 - 19:30 Dinner
A buffet dinner is served daily between 5:30pm and 7:30pm in Vistas Dining Room, top floor of the Sally Borden Building.
(Vistas Dining Room)
Thursday, May 15
07:00 - 08:45 Breakfast
Breakfast is served daily between 7 and 9am in the Vistas Dining Room, the top floor of the Sally Borden Building.
(Vistas Dining Room)
09:00 - 09:30 Jane Ye: New second-order optimality conditions for set-constrained optimization problems
In this talk we present some new second-order optimality conditions for a very general set-constrained optimization problem where the underlying set may be nonconvex. Utilizing the classical and/or the lower generalized support function, we obtain new second-order necessary and sufficient conditions via both the corresponding asymptotic second-order tangent cone and the outer second-order tangent set. Our necessary optimality conditions do not require convexity and/or nonemptiness of the second-order tangent set, and our sufficient conditions do not require convexity and/or nonemptiness of the second-order tangent set or the second-order regularity of the underlying set, as required in the classical results. These are important improvements to the classical results in the literature, since the second-order tangent set can be nonconvex and empty even when the set is convex, and second-order regularity is a very difficult condition to check.
(TCPL 201)
09:30 - 10:00 Xiaoming Yuan (TCPL 201)
10:00 - 10:30 Coffee Break (TCPL Foyer)
10:30 - 11:00 Matthew Tam: A first-order algorithm for decentralised min-max problems
In this talk, we consider a connected network of finitely many agents working cooperatively to solve a min-max problem with convex-concave structure. We propose a decentralised first-order algorithm which can be viewed as combining features of two algorithms: PG-EXTRA for decentralised minimisation problems and the forward reflected backward method for (non-distributed) min-max problems. In each iteration of our algorithm, each agent computes the gradient of the smooth component of its local objective function as well as the proximal operator of its nonsmooth component, followed by a round of communication with its neighbours.
(TCPL 201)
11:00 - 11:30 Bethany Caldwell (TCPL 201)
11:30 - 13:00 Lunch
Lunch is served daily between 11:30am and 1:30pm in the Vistas Dining Room, the top floor of the Sally Borden Building.
(Vistas Dining Room)
13:00 - 13:30 Henry Wolkowicz: A Peaceman-Rachford Splitting Method for the Protein Side-Chain Positioning Problem and other Hard Problems
This talk considers the NP-hard protein side-chain positioning problem, SCP, an important final task of protein structure prediction, formulated as an integer quadratic program, IQP. We derive its doubly nonnegative, DNN, (convex) relaxation. Strict feasibility fails for this relaxation, and we apply facial reduction, FR, as a natural splitting of variables. This allows for a variation of the Peaceman-Rachford splitting method, PRSM. The resulting relaxation and rounding procedures provide strong approximate solutions. Empirical evidence shows that almost all our instances of this NP-hard SCP problem, taken from the Protein Data Bank, are solved to provable optimality. We also present further applications to the Wasserstein barycenter problem.
(TCPL 201)
13:30 - 14:00 Minh Dao (TCPL 201)
14:00 - 14:30 Yalçın Kaya (TCPL 201)
14:30 - 15:00 Hoa Bui (TCPL 201)
15:00 - 15:30 Coffee Break (TCPL Foyer)
15:30 - 17:30 Discussion / Open Problems (TCPL 201)
17:30 - 19:30 Dinner
A buffet dinner is served daily between 5:30pm and 7:30pm in Vistas Dining Room, top floor of the Sally Borden Building.
(Vistas Dining Room)
Friday, May 16
07:00 - 08:45 Breakfast
Breakfast is served daily between 7 and 9am in the Vistas Dining Room, the top floor of the Sally Borden Building.
(Vistas Dining Room)
09:00 - 10:00 Discussion / Open problems / Wrap up (TCPL 201)
10:00 - 10:30 Coffee Break (TCPL Foyer)
10:30 - 11:00 Checkout by 11AM
5-day workshop participants are welcome to use BIRS facilities (TCPL) until 3 pm on Friday, although participants are still required to check out of the guest rooms by 11AM.
(Front Desk - Professional Development Centre)
12:00 - 13:30 Lunch from 11:30 to 13:30 (Vistas Dining Room)