Convergence rate estimates are derived for several subgradient methods for the problem of minimizing a nonsmooth convex Lipschitz homogeneous functional under functional constraints with relative accuracy with respect to the objective functional. It is proposed to apply analogs of known switching subgradient schemes to such problems, which also makes it possible to cover some classes of nonconvex problems. A convergence rate estimate is obtained for an adaptive mirror descent method with switchings on the class of weakly $\alpha$-quasiconvex objective and constraint functionals. For the minimization of a convex homogeneous objective functional under a weakly $\alpha$-quasiconvex constraint functional, a convergence rate estimate with relative accuracy with respect to the objective functional is proved for the proposed subgradient method with switchings. We also consider a method for minimizing a convex homogeneous Lipschitz functional under a unimodal Lipschitz constraint functional and derive an estimate for its convergence rate. All the convergence rate estimates proved in the paper show that the proposed algorithmic procedures are optimal from the viewpoint of the theory of lower oracle bounds.
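For the reader's intuition, here is a minimal sketch of the classical switching subgradient scheme (in the spirit of Polyak [5], Shor [6], and [8]) that underlies the methods announced above. It is not the paper's Algorithm 1 or 2: the Euclidean (non-mirror) setting, the Polyak-type step sizes, the fixed iteration budget, and the test problem in the usage lines are illustrative assumptions.

```python
import numpy as np

def switching_subgradient(f_sub, g, g_sub, x0, eps, max_iter=10_000):
    """Minimal switching subgradient sketch for min f(x) s.t. g(x) <= 0.

    f_sub and g_sub return (sub)gradients of f and g; g returns the
    constraint value.  Euclidean Polyak-type steps are used here; the
    paper's algorithms work with a general prox structure (mirror
    descent) and adaptive step sizes instead.
    """
    x = np.asarray(x0, dtype=float)
    productive = []  # iterates at which the constraint holds up to eps
    for _ in range(max_iter):
        if g(x) <= eps:
            d = f_sub(x)       # "productive" step along a subgradient of f
            productive.append(x.copy())
        else:
            d = g_sub(x)       # "non-productive" step: reduce the constraint
        if not np.any(d):
            break              # zero subgradient: current point is optimal
        x = x - (eps / np.dot(d, d)) * d
    # For convex Lipschitz f and g, the average of the productive iterates
    # is eps-optimal and eps-feasible after O(1/eps^2) iterations, which is
    # the optimal rate for this class in the sense of lower oracle bounds.
    return np.mean(productive, axis=0)

# Hypothetical usage: minimize the homogeneous functional f(x) = ||x||_1
# subject to g(x) = ||x - c||_2 - 1 <= 0 (a ball not containing the origin).
c = np.array([2.0, 0.0])
x_hat = switching_subgradient(
    f_sub=lambda x: np.sign(x),
    g=lambda x: np.linalg.norm(x - c) - 1.0,
    g_sub=lambda x: (x - c) / np.linalg.norm(x - c),
    x0=c, eps=1e-2)
print(x_hat)  # close to the constrained minimizer (1, 0)
```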
Keywords: relative accuracy, convex homogeneous functional, weakly $\alpha$-quasiconvex functional, mirror descent, Lipschitz-continuous functional, unimodal functional
Received June 9, 2020
Revised August 14, 2020
Accepted August 24, 2020
Funding Agency: The research of F. Stonyakin in Algorithm 1 and Theorems 1–3 and the research of I. Baran in Algorithm 2 were supported by a grant of the President of the Russian Federation for young Russian candidates of science (project MK-15.2020.1).
Fedor Sergeevich Stonyakin, Cand. Sci. (Phys.-Math.), V.I. Vernadsky Crimean Federal University, Simferopol, Republic of Crimea, 295007 Russia, e-mail: fedyor@mail.ru
Inna Viktorovna Baran, Cand. Sci. (Phys.-Math.), V.I. Vernadsky Crimean Federal University, Simferopol, Republic of Crimea, 295007 Russia, e-mail: matemain@mail.ru
REFERENCES
1. Nesterov Yu. Rounding of convex sets and efficient gradient methods for linear programming problems. Optimization Methods and Software, 2008, vol. 23, no. 1, pp. 109–128.
2. Nesterov Yu. Unconstrained convex minimization in relative scale. Math. Oper. Res., 2009, vol. 34, no. 1, pp. 180–193. doi: 10.1287/moor.1080.0348
3. Nesterov Yu. E. Algoritmicheskaya vypuklaya optimizatsiya [Algorithmic convex optimization]. Dr. Phys.-Math. Sci. Dissertation. Moscow: Mosk. Phys.-Tech. Inst. (State University) Publ., 2013, 367 p.
4. Nemirovskii A., Yudin D. Problem complexity and method efficiency in optimization. New York: J. Wiley & Sons, 1983, 388 p. Original Russian text published in Nemirovskii A., Yudin D. Slozhnost' zadach i effektivnost' metodov optimizatsii, Moscow: Nauka Publ., 1979, 384 p.
5. Polyak B. T. A general method of solving extremum problems. Sov. Math. Dokl., 1967, vol. 8, no. 3, pp. 593–597.
6. Shor N. Z. Application of generalized gradient descent in block programming. Kibernetika, 1967, no. 3, pp. 53–55 (in Russian).
7. Nemirovskii A., Yudin D. Efficient methods for large-scale convex optimization problems. Ekonomika i Mat. Metody, 1979, no. 2, pp. 135–152 (in Russian).
8. Bayandina A., Dvurechensky P., Gasnikov A., Stonyakin F., Titov A. Mirror descent and convex optimization problems with non-smooth inequality constraints. In: Giselsson P., Rantzer A. (eds.) Large-Scale and Distributed Optimization, Ser. Lect. Notes Math., vol. 2227, Cham: Springer, 2018, pp. 181–213. doi: 10.1007/978-3-319-97478-1_8
9. Stonyakin F., Stepanov A., Gasnikov A., Titov A. Mirror descent for constrained optimization problems with large subgradient values of functional constraints. Computer Research and Modeling, 2020, vol. 12, no. 2, pp. 301–317. doi: 10.20537/2076-7633-2020-12-2-301-317
10. Stonyakin F., Alkousa M., Stepanov A., Titov A. Adaptive mirror descent algorithms for convex and strongly convex optimization problems with functional constraints. J. Appl. Industr. Math., 2019, vol. 13, no. 3, pp. 557–574. doi: 10.33048/daio.2019.26.636
11. Gasnikov A. V. Sovremennye chislennye metody optimizatsii. Metod universal’nogo gradientnogo spuska [Modern numerical optimization methods. The universal gradient descent method]. Moscow: MIPT Publ., 2018, 286 p.
12. Guminov S., Nesterov Yu., Dvurechensky P., Gasnikov A. Primal-dual accelerated gradient methods with small-dimensional relaxation oracle. Dokl. Akad. Nauk, 2019, vol. 485, no. 1, pp. 15–18 (in Russian). doi: 10.31857/S0869-5652485115-18
13. Hinder O., Sidford A., Sohoni N. S. Near-optimal methods for minimizing star-convex functions and beyond [e-resource]. 2019. 37 p. Available at https://arxiv.org/pdf/1906.11985.pdf
14. Nemirovski A. Lectures on modern convex optimization: analysis, algorithms, and engineering applications. Philadelphia: SIAM, 2013, 647 p.
15. Nesterov Yu. Effektivnye metody v nelineinom programmirovanii [Effective methods in nonlinear programming]. Moscow: Radio i Svyaz' Publ., 1989, 301 p.
Cite this article as: F. S. Stonyakin, I. V. Baran, On some algorithms for constrained optimization problems with relative accuracy with respect to the objective functional, Trudy Instituta Matematiki i Mekhaniki URO RAN, 2020, vol. 26, no. 3, pp. 198–210.