Gradient-free proximal methods with inexact oracle for convex stochastic nonsmooth optimization problems on the simplex



Abstract

In this paper we propose a modification of the mirror descent method for nonsmooth stochastic convex optimization problems on the unit simplex. The problems considered differ from the classical setting in that only (possibly noisy) realizations of the function values are available, rather than subgradients. Our purpose is to derive the convergence rate of the proposed method and to determine the level of oracle noise that does not significantly affect this rate.
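The abstract describes a gradient-free (zeroth-order) variant of mirror descent on the unit simplex. A minimal illustrative sketch of this class of methods is given below; it is not the authors' exact algorithm, and the step size `eta`, smoothing radius `mu`, and the two-point finite-difference gradient estimator are generic textbook choices, not parameters taken from the paper. It combines a randomized two-point function-value estimate of the gradient with the entropic (multiplicative-weights) mirror step, which keeps the iterates on the simplex automatically.

```python
import math
import random

def entropic_md_zeroth_order(f, n, steps, eta, mu, rng):
    """Entropic mirror descent on the unit simplex using only
    (noisy) evaluations of f; illustrative sketch, not the paper's
    exact method or parameter choices."""
    x = [1.0 / n] * n            # start at the centre of the simplex
    avg = [0.0] * n              # running average of the iterates
    for _ in range(steps):
        # draw a random direction uniformly on the unit sphere
        u = [rng.gauss(0.0, 1.0) for _ in range(n)]
        norm = math.sqrt(sum(ui * ui for ui in u))
        u = [ui / norm for ui in u]
        # two-point finite-difference estimate of the (sub)gradient
        xp = [xi + mu * ui for xi, ui in zip(x, u)]
        xm = [xi - mu * ui for xi, ui in zip(x, u)]
        scale = n * (f(xp) - f(xm)) / (2.0 * mu)
        g = [scale * ui for ui in u]
        # entropic mirror (multiplicative-weights) update:
        # stays inside the simplex by construction
        w = [xi * math.exp(-eta * gi) for xi, gi in zip(x, g)]
        s = sum(w)
        x = [wi / s for wi in w]
        avg = [ai + xi / steps for ai, xi in zip(avg, x)]
    return avg

# example: minimize the l1-distance to a target point c on the simplex,
# observing only noisy function values (noise level chosen arbitrarily)
rng = random.Random(0)
c = [0.5, 0.3, 0.2]
f = lambda x: sum(abs(xi - ci) for xi, ci in zip(x, c)) + rng.gauss(0.0, 1e-3)
x_bar = entropic_md_zeroth_order(f, n=3, steps=3000, eta=0.05, mu=0.05, rng=rng)
```

Returning the averaged iterate `x_bar` rather than the last one is the standard choice for nonsmooth stochastic mirror descent, for which the convergence guarantees are stated.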

About the authors

A. V. Gasnikov

Moscow Institute of Physics and Technology (State University); Institute for Information Transmission Problems (Kharkevich Institute)

Author for correspondence.
Email: gasnikov@yandex.ru
Russian Federation, Moscow; Moscow

A. A. Lagunovskaya

Moscow Institute of Physics and Technology (State University); Keldysh Institute of Applied Mathematics

Email: gasnikov@yandex.ru
Russian Federation, Moscow; Moscow

I. N. Usmanova

Moscow Institute of Physics and Technology (State University); Institute for Information Transmission Problems (Kharkevich Institute)

Email: gasnikov@yandex.ru
Russian Federation, Moscow; Moscow

F. A. Fedorenko

Moscow Institute of Physics and Technology (State University)

Email: gasnikov@yandex.ru
Russian Federation, Moscow


Copyright (c) 2016 Pleiades Publishing, Ltd.