F.S. Stonyakin, M.S. Alkousa, A.N. Stepanov, M.A. Barinov. Adaptive mirror descent algorithms in convex programming problems with Lipschitz constraints ... P. 266-279

The paper is devoted to new modifications of recently proposed adaptive Mirror Descent methods for convex minimization problems with several convex functional constraints. Methods for problems of two types are considered. In problems of the first type, the objective functional is Lipschitz (generally, nonsmooth). In problems of the second type, the gradient of the objective functional is Lipschitz. We also consider the case of a nonsmooth objective functional equal to the maximum of smooth functionals with Lipschitz gradient. In all the cases, the functional constraints are assumed to be Lipschitz and, generally, nonsmooth. The proposed modifications make it possible to reduce the running time of the algorithm by skipping some of the functional constraints at nonproductive steps. We derive bounds on the convergence rate, which show that the methods under consideration are optimal from the viewpoint of lower oracle estimates. Results of numerical experiments illustrating the advantages of the proposed procedures on several examples are presented.

Keywords: adaptive Mirror Descent, Lipschitz functional, Lipschitz gradient, productive step, nonproductive step.
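The productive/nonproductive step mechanism described in the abstract can be illustrated by a minimal sketch. This is a hypothetical simplification, not the authors' exact algorithm: it uses the Euclidean prox setup, a fixed iteration budget instead of the paper's stopping rule, and invented function names. A step is productive when all constraints are satisfied up to an accuracy eps, and a subgradient of the objective is used; otherwise the step is nonproductive, and a subgradient of the first violated constraint is used while the remaining constraints are skipped, which is the source of the running-time savings.

```python
def adaptive_md(grad_f, constraints, x0, eps, n_iter):
    """Simplified sketch of adaptive Mirror Descent with constraint skipping.

    constraints: list of (g_i, grad_g_i) pairs for the conditions g_i(x) <= 0.
    Returns the stepsize-weighted average of the productive iterates.
    """
    x = list(x0)
    prod_pts, prod_h = [], []
    for _ in range(n_iter):
        # Check constraints one by one; stop at the first violation,
        # skipping the remaining constraints on a nonproductive step.
        violated = next((gg for g, gg in constraints if g(x) > eps), None)
        if violated is None:                       # productive step
            d = grad_f(x)
            h = eps / sum(di * di for di in d)     # adaptive step eps/||d||^2
            prod_pts.append(list(x))
            prod_h.append(h)
        else:                                      # nonproductive step
            d = violated(x)
            h = eps / sum(di * di for di in d)
        x = [xi - h * di for xi, di in zip(x, d)]
    total = sum(prod_h)
    return [sum(h * p[i] for h, p in zip(prod_h, prod_pts)) / total
            for i in range(len(x0))]

# Toy example: minimize the nonsmooth Lipschitz functional |x1| + |x2|
# subject to x1 + x2 >= 1; the optimum is (0.5, 0.5).
grad_f = lambda x: [1.0 if xi >= 0 else -1.0 for xi in x]   # a subgradient
g  = lambda x: 1.0 - x[0] - x[1]                            # g(x) <= 0
gg = lambda x: [-1.0, -1.0]
x_bar = adaptive_md(grad_f, [(g, gg)], x0=[2.0, 2.0], eps=0.05, n_iter=5000)
```

With these parameters the weighted average of the productive iterates lands near the optimum (0.5, 0.5), feasible up to the accuracy eps, which matches the ε-optimality guarantee stated for this class of methods.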

The paper was received by the Editorial Office on March 30, 2018.

Funding Agency:

1) Russian Foundation for Basic Research (Grant Number 18-31-00219);

2) Ministry of Education and Science of the Russian Federation (Grant Number MK-176.2017.1).

Fedor Sergeevich Stonyakin, Cand. Sci. (Phys.-Math.), V.I. Vernadsky Crimean Federal University, Simferopol, Republic of Crimea, 295007 Russia, e-mail: fedyor@mail.ru.

Mohammad S. Alkousa, doctoral student, Moscow Institute of Physics and Technology, Moscow, 141701 Russia, e-mail: mohammad.alkousa@phystech.edu.

Aleksei Nikolaevich Stepanov, co-investigator in a research project supported by a grant of the President of the Russian Federation for young candidates of science (project no. MK-176.2017.1), V.I. Vernadsky Crimean Federal University, Simferopol, Republic of Crimea, 295007 Russia, e-mail: stepanov.student@gmail.com.

Maksim Aleksandrovich Barinov, undergraduate student, V.I. Vernadsky Crimean Federal University, Simferopol, Republic of Crimea, 295007 Russia, e-mail: ice8scream@gmail.com.


1.   Ben-Tal A., Nemirovski A. Robust truss topology design via semidefinite programming. SIAM J. Optim., 1997, vol. 7, no. 4, pp. 991–1016. doi: 10.1137/S1052623495291951 .

2.   Shpirko S., Nesterov Yu. Primal-dual subgradient methods for huge-scale linear conic problem. SIAM J. Optim., 2014, vol. 24, no. 3, pp. 1444–1457. doi: 10.1137/130929345 .

3.   Beck A., Teboulle M. Mirror descent and nonlinear projected subgradient methods for convex optimization. Operations Research Letters, 2003, vol. 31, no. 3, pp. 167–175. doi: 10.1016/S0167-6377(02)00231-6 .

4.   Nemirovsky A., Yudin D. Problem complexity and method efficiency in optimization. N.Y.: J. Wiley & Sons, 1983, 404 p. ISBN: 978-0471103455 .

5.   Polyak B. A general method of solving extremum problems. Soviet Math. Dokl., 1967, vol. 8, no. 3, pp. 593–597 (in Russian).

6.   Shor N.Z. Generalized gradient descent with application to block programming. Cybernetics, 1967, vol. 3, no. 3, pp. 43–45 (in Russian).

7.   Nemirovskii A. Efficient methods for solving convex programming problems of high dimension. Ekonomika i Matematicheskie Metody, 1979, vol. 15, no. 1 (in Russian).

8.   Beck A., Ben-Tal A., Guttmann-Beck N., Tetruashvili L. The comirror algorithm for solving nonsmooth constrained convex problems. Operations Research Letters, 2010, vol. 38, no. 6, pp. 493–498. doi: 10.1016/j.orl.2010.08.005 .

9.   Ben-Tal A., Nemirovski A. Lectures on Modern Convex Optimization. Philadelphia: Society for Industrial and Appl. Math., 2001, 488 p. ISBN: 0-89871-491-5 .

10.   Bayandina A., Dvurechensky P., Gasnikov A., Stonyakin F., Titov A. Mirror descent and convex optimization problems with non-smooth inequality constraints [e-resource]. 2018. 30 p. Available at: https://arxiv.org/abs/1710.06612 .

11.   Nesterov Y. Introductory lectures on convex optimization: a basic course. Boston: Kluwer Acad. Publ., 2004, 236 p. doi: 10.1007/978-1-4419-8853-9 .

12.   Nesterov Y. Subgradient methods for convex functions with nonstandard growth properties [e-resource, slides]. Available at: https://www.mathnet.ru:8080/PresentFiles/16179/growthbm_nesterov.pdf [Online; accessed 15-April-2018] .

13.   CPython [site]. Available at: https://www.python.org/ .