S.S. Ablaev, F.S. Stonyakin, M.S. Alkousa, A.V. Gasnikov. Adaptive subgradient methods for mathematical programming problems with quasi-convex functions ... P. 7-25

The paper is devoted to subgradient methods with switching between productive and non-productive steps for minimization problems with quasi-convex functions under functional inequality constraints. For the problem of minimizing a convex function with quasi-convex inequality constraints, a convergence result is obtained for the subgradient method with an adaptive stopping rule. Further, on the basis of an analogue of a sharp minimum for nonlinear problems with inequality constraints, convergence at the rate of a geometric progression is established for restarted versions of the subgradient methods. These results are considered separately for the case of a convex objective function with quasi-convex inequality constraints and for the case of a quasi-convex objective function with convex inequality constraints. Convexity additionally makes it possible to propose adaptive stopping rules for the auxiliary methods that guarantee an acceptable solution quality. Results of computational experiments are presented, showing the advantages of using such stopping rules.
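As a rough illustration of the switching scheme described in the abstract, the sketch below alternates productive steps (along a subgradient of the objective when the constraint is satisfied up to a tolerance) and non-productive steps (along a subgradient of the constraint otherwise), with classical Polyak-type step sizes. This is a minimal generic sketch on a toy convex problem, not the adaptive restarted methods of the paper; all function names and the example problem are illustrative assumptions.

```python
import numpy as np

def switching_subgradient(f, g, f_sub, g_sub, x0, eps, n_iter=5000):
    """Generic switching subgradient scheme for min f(x) s.t. g(x) <= 0.

    Productive step:     g(x) <= eps -> move along a subgradient of f.
    Non-productive step: g(x) >  eps -> move along a subgradient of g.
    Step size eps / ||h||^2 follows the classical Polyak-type rule.
    """
    x = np.asarray(x0, dtype=float)
    productive = []                      # iterates collected on productive steps
    for _ in range(n_iter):
        if g(x) <= eps:                  # productive step
            h = f_sub(x)
            productive.append(x.copy())
        else:                            # non-productive step
            h = g_sub(x)
        norm2 = float(h @ h)
        if norm2 == 0.0:                 # zero subgradient: stationary point
            break
        x = x - (eps / norm2) * h
    # report the best productive iterate found (it satisfies g(x) <= eps)
    return min(productive, key=f) if productive else x

# toy example: minimize |x1 - 2| + |x2 - 2| subject to x1 + x2 - 1 <= 0
f = lambda x: abs(x[0] - 2) + abs(x[1] - 2)
g = lambda x: x[0] + x[1] - 1
f_sub = lambda x: np.sign(x - 2)
g_sub = lambda x: np.array([1.0, 1.0])

x_star = switching_subgradient(f, g, f_sub, g_sub, x0=[0.0, 0.0], eps=1e-2)
```

For this toy problem the optimal value is 3 on the line x1 + x2 = 1, and the returned point satisfies the constraint up to the tolerance eps, so f(x_star) is within about eps of the optimum.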

Keywords: subgradient method, quasi-convex function, sharp minimum, restarts, adaptive method

Received May 14, 2023

Revised July 4, 2023

Accepted July 10, 2023

Funding Agency: The research of F.S.Stonyakin in Sections 2 and 5 was supported by the Strategic Academic Leadership Program “Priority 2030” (agreement no. 075-02-2021-1316 of September 30, 2021). The research of A.V.Gasnikov in Section 4 was carried out under a state task of the Ministry of Science and Higher Education of the Russian Federation (project no. 0714-2020-0005).

Seydamet Serverovich Ablaev, doctoral student, V.I.Vernadsky Crimean Federal University, Simferopol, Republic of Crimea, 295007 Russia, e-mail: seydamet.ablaev@yandex.ru

Fedor Sergeevich Stonyakin, Dr. Phys.-Math. Sci., Prof., Moscow Institute of Physics and Technology (National Research University), Dolgoprudny, Moscow Region, 141701 Russia; V.I.Vernadsky Crimean Federal University, Simferopol, Republic of Crimea, 295007 Russia, e-mail: fedyor@mail.ru

Mohammad S. Alkousa, Cand. Phys.-Math. Sci., Moscow Institute of Physics and Technology (National Research University), Dolgoprudny, Moscow Region, 141701 Russia, e-mail: mohammad.alkousa@phystech.edu

Aleksandr Vladimirovich Gasnikov, Dr. Phys.-Math. Sci., Prof., Moscow Institute of Physics and Technology (National Research University), Dolgoprudny, Moscow Region, 141701 Russia; Caucasus Mathematical Center, Adyghe State University, Maikop, 385000 Russia; Institute for Information Transmission Problems of the Russian Academy of Sciences (Kharkevich Institute), Moscow, 127051 Russia, e-mail: gasnikov@yandex.ru

REFERENCES

1.   Polyak B.T. A general method of solving extremum problems. Sov. Math. Dokl., 1967, vol. 8, no. 3, pp. 593–597.

2.   Huang Y., Lin Q. Single-loop switching subgradient methods for non-smooth weakly convex optimization with non-smooth convex constraints. 2023, 49 p. Available at https://arxiv.org/pdf/2301.13314.pdf 

3.   Bayandina A., Dvurechensky P., Gasnikov A., Stonyakin F., Titov A. Mirror descent and convex optimization problems with non-smooth inequality constraints. In: Large-Scale and Distributed Optimization, eds. Pontus Giselsson and Anders Rantzer, 2018, Ser. Lecture Notes in Math., vol. 2227, pp. 181–213. doi: 10.1007/978-3-319-97478-1_8

4.   Lagae S. New efficient techniques to solve sparse structured linear systems, with applications to truss topology optimization. Ecole polytechnique de Louvain, Université catholique de Louvain, 2017. Available at https://dial.uclouvain.be/memoire/ucl/object/thesis:12934

5.   Nesterov Yu. Subgradient methods for huge-scale optimization problems. Math. Prog., 2015, vol. 146, no. 1–2, pp. 275–297. doi: 10.1007/s10107-013-0686-4

6.   Stonyakin F.S., Alkousa M.S., Stepanov A.N., Barinov M.A. Adaptive mirror descent algorithms in convex programming problems with Lipschitz constraints. Trudy Instituta Matematiki i Mekhaniki UrO RAN, 2018, vol. 24, no. 2, pp. 266–279 (in Russian). doi: 10.21538/0134-4889-2018-24-2-266-279

7.   Stonyakin F.S., Stepanov A.N., Gasnikov A.V., Titov A.A. Mirror descent for constrained optimization problems with large subgradient values of functional constraints. Computer Research and Modeling, 2020, vol. 12, no. 2, pp. 301–317. doi: 10.20537/2076-7633-2020-12-2-301-317

8.   Lin Q., Ma R., Nadarajah S., Soheili N. A parameter-free and projection-free restarting level set method for adaptive constrained convex optimization under the error bound condition. 2022, 29 p. Available at https://arxiv.org/pdf/2010.15267.pdf

9.   Polyak B.T. Vvedenie v optimizaciyu [Introduction to Optimization]. Moscow: Nauka Publ., 1983, 384 p.

10.   Nesterov Yu. Effektivnye metody nelinejnogo programmirovaniya [Effective methods of nonlinear programming]. Moscow: Radio and Communication Publ., 1989, 301 p.

11.   Clarke F. Optimization and nonsmooth analysis. NY: Wiley-Interscience, 1983, 308 p. Translated to Russian under the title Optimizatsiya i negladkii analiz, Moscow: Nauka Publ., 1988, 280 p.

12.   Polyak B.T. Minimization of nonsmooth functionals. Comp. Math. and Math. Phys., 1969, vol. 9, no. 3, pp. 509–521 (in Russian).

13.   Boyd S., Vandenberghe L. Convex optimization. Cambridge: Cambridge Univ. Press, 2004, 716 p. ISBN: 9780521833783

Cite this article as: S.S. Ablaev, F.S. Stonyakin, M.S. Alkousa, A.V. Gasnikov. Adaptive subgradient methods for mathematical programming problems with quasi-convex functions. Trudy Instituta Matematiki i Mekhaniki UrO RAN, 2023, vol. 29, no. 3, pp. 7–25. Proceedings of the Steklov Institute of Mathematics (Suppl.), 2023, Vol. 323, Suppl. 1, pp. S1–S18.