CCOR Workshop – Celebrating Operations Research at Corvinus University: 5th anniversary of CCOR

The Corvinus Centre for Operations Research (CCOR), the Corvinus Institute for Advanced Studies (CIAS), and the Institute of Operations and Decision Sciences invite you to the

CCOR Workshop titled Celebrating Operations Research at Corvinus University: 5th anniversary of CCOR.

Venue: Corvinus University of Budapest, Building E, Faculty Club 

Date: September 17, 2025 (Wednesday), 14:00 – 17:30 

Keynote speakers: Yinyu Ye (Stanford University), Radu Ioan Bot (University of Vienna)

Speakers: Robert Csetnek (University of Vienna), David Alexander Hulett (University of Vienna)

Programme

14:00-14:10: Opening

14:10-15:00: Yinyu Ye: Mathematical Optimization and Operations Research for AI

15:00-15:30: Robert Csetnek: Fast optimization methods and fast OGDA for monotone equations

15:30-16:00: Coffee break

16:00-16:30: David Alexander Hulett: Acceleration in hierarchical optimization

16:30-17:20: Radu Ioan Bot: Strong convergence and fast residual decay for monotone operator flows via Tikhonov regularization

17:20-17:30: Closing

Abstracts

Yinyu Ye: Mathematical Optimization and Operations Research for AI

This talk presents several mathematical optimization and operations research (OR) problems, models, and algorithms for AI, such as LLM training, tuning, and inference. In particular, we describe how classic OR models and theories, such as online resource allocation, may be applied to accelerate and improve the training, tuning, and inference processes that are widely used for LLMs. On the other hand, we show breakthroughs in classical optimization (LP and SDP) solver development and their applications, aided by AI-related techniques and their implementation on GPUs.
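
As a rough illustration of the classic OR models the talk alludes to, online resource allocation is often written as a linear program whose data arrive one column at a time (the notation below is generic, not taken from the talk):

$$\max_{x} \ \sum_{t=1}^{T} c_t x_t \quad \text{s.t.} \quad \sum_{t=1}^{T} a_t x_t \le b, \qquad 0 \le x_t \le 1 \ \text{for all } t,$$

where the data $(c_t, a_t)$ are revealed only at time $t$ and the decision $x_t$ must be fixed before the next request arrives.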

Robert Csetnek: Fast optimization methods and fast OGDA for monotone equations

In the framework of real Hilbert spaces, we study continuous-time dynamics as well as numerical algorithms for the problem of approaching the set of zeros of a single-valued monotone and continuous operator. The starting point is a second-order dynamical system that combines a vanishing damping term with the time derivative of the operator along the trajectory. We also prove the weak convergence of the trajectory to a zero of the operator. Temporal discretizations of the dynamical system generate implicit and explicit numerical algorithms, which can both be seen as accelerated versions of the Optimistic Gradient Descent Ascent (OGDA) method, and for which we prove that the generated sequence of iterates shares the asymptotic features of the continuous dynamics. In particular, for the implicit numerical algorithm we show convergence rates for the restricted gap function. For the explicit numerical algorithm, we show convergence rates under the additional assumption that the operator is Lipschitz continuous. All convergence rate statements are last-iterate convergence results; in addition, we prove for both algorithms the convergence of the iterates to a zero of the operator.
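
For orientation, the classical OGDA update for a monotone operator $V$ with step size $\eta > 0$ reads

$$z_{k+1} = z_k - \eta \left( 2 V(z_k) - V(z_{k-1}) \right),$$

and the second-order system described above has the schematic shape $\ddot{z}(t) + \frac{\alpha}{t} \dot{z}(t) + \beta \frac{d}{dt} V(z(t)) + \gamma(t) V(z(t)) = 0$; the precise coefficients are those of the fast OGDA dynamics of Bot, Csetnek and Nguyen (2022) and are not reproduced here.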

David Alexander Hulett: Acceleration in hierarchical optimization

Problem 1: we minimize an outer function over the set of minimizers of an inner function, where both the inner and outer levels have a composite smooth+nonsmooth structure. Attached to this problem, we formulate two dynamical systems: a first-order one in the spirit of the gradient flow, and a second-order one in the spirit of the Su-Boyd-Candès (2014) dynamics. Besides being interesting in their own right, these dynamics can be discretized to recover two numerical algorithms: a gradient descent-type one in the spirit of Merchav and Sabach’s (2023) Bi-SG algorithm (without momentum) and an accelerated Nesterov-type one in the spirit of Merchav, Sabach and Teboulle’s (2024) FBi-PG algorithm (with momentum). The continuous-time analysis serves as a guide for the analysis of the numerical algorithms, and we obtain new state-of-the-art little-o convergence rates, as well as weak convergence of the iterates.
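
In symbols (generic notation, not taken from the talk), Problem 1 has the bilevel form

$$\min_{x} \ f_1(x) + f_2(x) \quad \text{s.t.} \quad x \in \operatorname*{argmin}_{y} \ g_1(y) + g_2(y),$$

with $f_1, g_1$ smooth and $f_2, g_2$ nonsmooth, while the Su-Boyd-Candès (2014) dynamics is the second-order system $\ddot{x}(t) + \frac{\alpha}{t} \dot{x}(t) + \nabla f(x(t)) = 0$ (with $\alpha = 3$ in the original paper).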

Problem 2: we minimize an outer function, which again has a composite smooth+nonsmooth structure, over the set of zeros of a monotone and continuous operator. Attached to this problem, we once again formulate two dynamical systems: a first-order one in the spirit of gradient flow (with an extra correction term) and a second-order one in the spirit of the fast OGDA dynamics studied by Bot, Csetnek and Nguyen (2022). The numerical counterpart to these dynamics is a work in progress.
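
In the same generic notation, Problem 2 replaces the inner minimization by a zero-set constraint:

$$\min_{x} \ f_1(x) + f_2(x) \quad \text{s.t.} \quad x \in \operatorname{zer} V := \{ y : V(y) = 0 \},$$

where $V$ is a monotone and continuous operator.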

Radu Ioan Bot: Strong convergence and fast residual decay for monotone operator flows via Tikhonov regularization

In the framework of real Hilbert spaces, we investigate first-order dynamical systems governed by monotone and continuous operators. It has been established that for these systems, only the ergodic trajectory converges to a zero of the operator. However, trajectory convergence is assured for operators with the stronger property of cocoercivity. For this class of operators, the trajectory’s velocity and the operator values along the trajectory converge in norm to zero at a rate of $o(1/\sqrt{t})$ as $t \to +\infty$.
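
Concretely (in generic notation), the flow in question is $\dot{x}(t) + V(x(t)) = 0$ for a monotone and continuous operator $V$, and the ergodic trajectory referred to above is the time average

$$\sigma(t) := \frac{1}{t} \int_0^t x(s) \, ds.$$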

In this talk, we show that augmenting a monotone operator flow with a Tikhonov regularization term ensures not only strong convergence of the trajectory to the minimal-norm element of the zero set, but also enables the derivation of explicit convergence rates. In particular, we establish norm rates for the trajectory’s velocity and for the residual of the operator along the trajectory, expressed in terms of the regularization function. In some particular cases, these rates can be as fast as $O(1/t)$ as $t \rightarrow +\infty$. In this way, we emphasize a surprising acceleration feature of the Tikhonov regularization. Additionally, we explore these properties for monotone operator flows that incorporate time rescaling and an anchor point. For a specific choice of the Tikhonov regularization function, these flows are closely linked to second-order dynamical systems with a vanishing damping term. The convergence and convergence rate results we achieve for these systems complement recent findings for the Fast Optimistic Gradient Descent Ascent (OGDA) dynamics.
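
A minimal sketch of the Tikhonov-regularized flow, in generic notation with regularization function $\varepsilon(\cdot)$ decaying to zero:

$$\dot{x}(t) + V(x(t)) + \varepsilon(t) x(t) = 0.$$

The Tikhonov term $\varepsilon(t) x(t)$ is what drives the trajectory strongly to the minimal-norm element of $\operatorname{zer} V$.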

Finally, we derive, via an explicit discretization of the Tikhonov regularized monotone flow, a novel Extra-Gradient method with an anchor term governed by general parameters. We establish strong convergence to specific points within the solution set, as well as convergence rates expressed in terms of the regularization parameters. Notably, our approach recovers the fast residual decay rate $O(1/k)$ as $k \rightarrow +\infty$ for standard parameter choices.
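
For reference, one well-known anchored Extra-Gradient template is the EAG scheme of Yoon and Ryu (2021), which, with step size $\eta > 0$, anchor $z_0$, and anchoring coefficients $\beta_k$ of order $1/k$, iterates

$$\bar{z}_k = z_k + \beta_k (z_0 - z_k) - \eta V(z_k), \qquad z_{k+1} = z_k + \beta_k (z_0 - z_k) - \eta V(\bar{z}_k)$$

and already attains the residual rate $\|V(z_k)\| = O(1/k)$; the method in the talk lets the anchoring be governed by general regularization parameters instead.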