# A.S. Antipin. Optimization methods for the sensitivity function with constraints ... P. 33-42.

We consider a parametric family of convex programming problems. The parameter is the vector of the right-hand sides in the functional constraints of the problem. Each value of the parameter vector taken from the nonnegative orthant corresponds to a regular (in the sense of Slater's condition) convex programming problem and to the minimum value of its objective function. This value depends on the constraint parameter and generates the sensitivity function. Along with this function, a convex set is given, either geometrically or functionally. The problem of minimizing the implicit sensitivity function over this set is posed. It can be interpreted as a convex programming problem in which, instead of a given vector of the right-hand sides of the functional constraints, only a set to which this vector belongs is specified. As a result, we obtain a two-level problem. In contrast to the classical two-level hierarchical problems, where the constraints are given implicitly, it is the objective function that is given implicitly in our case; there is no hierarchy in this problem. In the literature, sensitivity functions are usually discussed in a more general context as optimal value functions. The author is not aware of optimization statements of these problems as independent studies, much less of solution methods for them. A new saddle-point approach to the solution of problems with sensitivity functions is proposed. The monotone convergence of the method is proved with respect to the variables of the space in which the problem is considered.
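In the setting described above, the construction can be sketched in formulas (an illustrative sketch; the symbols $f$, $g$, $x$, $b$, $Q$, and $B$ are our own notation, not necessarily that of the paper):

```latex
% Parametric family of convex programs: the parameter b \ge 0 is the
% vector of right-hand sides of the functional constraints.
\varphi(b) \;=\; \min\bigl\{\, f(x) \;:\; g(x) \le b,\ x \in Q \,\bigr\},
\qquad b \in \mathbb{R}^m_+,
% where f and the components of g are convex on the convex set Q and
% Slater's condition holds for every b in the nonnegative orthant.
% The optimal value \varphi(b) is the sensitivity function.

% Two-level problem: instead of fixing b, only a convex set B \subseteq
% \mathbb{R}^m_+ containing it is given, and the implicit function
% \varphi is minimized over B:
\min\bigl\{\, \varphi(b) \;:\; b \in B \,\bigr\}.
```

Here the lower level produces the value $\varphi(b)$ implicitly, while the upper level minimizes it over $B$; as the abstract notes, it is the objective function, not the constraints, that is given implicitly.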

Keywords: sensitivity function, parametric optimization, parametric Lagrangian, saddle point, extraproximal methods, convergence.

The paper was received by the Editorial Office on June 13, 2017.

Anatolii Sergeevich Antipin, Dr. Phys.-Math. Sci., Prof., Federal Research Center "Computer Science and Control" of the Russian Academy of Sciences, Moscow, 119333 Russia, e-mail: asantip@yandex.ru

REFERENCES

1.   Antipin A.S., Golikov A.I., Khoroshilova E.V. Sensitivity function: Properties and applications. Comput. Math. Math. Phys., 2011, vol. 51, no. 12, pp. 2000–2016. doi: 10.1134/S0965542511120049.

2.   Vasil’ev F.P. Metody optimizatsii: T. 1, 2 [Optimization methods: Vols. 1, 2]. Moscow, MTsNMO Publ., 2011, 1056 p.

3.   Rockafellar R.T. Monotone operators and the proximal point algorithm. SIAM J. Control and Optimization, 1976, vol. 14, no. 5, pp. 877–898. doi: 10.1137/0314056.

4.   Antipin A.S. On the method of convex programming using the symmetric modification of the Lagrange function. Ekonomika i Matematicheskie Metody, 1976, vol. 12, no. 6, pp. 1164–1173 (in Russian).

5.   Antipin A.S. The convergence of proximal methods to fixed points of extremal mappings and estimates of their rate of convergence. Comput. Math. Math. Phys., 1995, vol. 35, no. 5, pp. 539–551.

6.   Konnov I.V. Nelineinaya optimizatsiya i variatsionnye neravenstva [Nonlinear optimization and variational inequalities]. Kazan, Kazan Federal University Publ., 2013, 508 p. ISBN: 978-5-00019-059-3.

7.   Antipin A.S. Saddle problem and optimization problem as an integrated system. Proc. Steklov Inst. Math. (Suppl.), 2008, vol. 263 (suppl. 2), pp. 3–14. doi: 10.1134/S0081543808060023.

8.   Antipin A.S. An extraproximal method for solving equilibrium programming problems and games with coupled variables. Comput. Math. Math. Phys., 2005, vol. 45, no. 12, pp. 2020–2029.

9.   Clempner J., Poznyak A.S. Using the extraproximal method for computing the shortest-path mixed Lyapunov equilibrium in Stackelberg security games. Mathematics and Computers in Simulation, 2017, vol. 138, pp. 14–30. doi: 10.1016/j.matcom.2016.12.010.

10.   Trejo K.K., Clempner J.B., Poznyak A.S. A Stackelberg security game with random strategies based on the extraproximal theoretic approach. Engineering Applications of Artificial Intelligence, 2015, vol. 37, pp. 145–153. doi: 10.1016/j.engappai.2014.09.002.