Abstract:
Minimax problems with constrained variables are considered. It is shown that, under the stated assumptions, the internal maximum function is differentiable in the sense of Clarke and regular. A generalized stochastic gradient method is proposed to minimize this function in the presence of constraints. It is shown how the parameters of the method can be chosen consistently with the convergence of a "diagonal" procedure of the Arrow-Hurwicz type in the case where the internal maximization problem is concave.
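The "diagonal" idea can be illustrated with a minimal sketch: instead of solving the inner maximization exactly at each iteration, one interleaves a noisy ascent step in the inner variable with a noisy projected descent step in the outer variable, with step sizes decaying at different rates. The saddle function, constraint intervals, step-size exponents, and noise level below are all illustrative assumptions chosen for a runnable toy example; they are not taken from the paper.

```python
import random

random.seed(0)

def project(z, lo, hi):
    # Euclidean projection onto the interval [lo, hi]
    return max(lo, min(hi, z))

# Illustrative concave inner problem: f(x, y) = x*y - y**2 / 2, y in [-1, 1],
# so F(x) = max_y f(x, y) equals x**2 / 2 for |x| <= 1 and is minimized at x = 0.
x, y = 2.0, 0.0              # assumed constraints: x in [-2, 2], y in [-1, 1]
for k in range(1, 5001):
    a = k ** -0.75           # decaying step for the outer (minimizing) variable
    b = k ** -0.51           # faster inner steps: the "diagonal" coupling
    gx = y + 0.01 * random.gauss(0.0, 1.0)        # noisy partial gradient in x
    gy = (x - y) + 0.01 * random.gauss(0.0, 1.0)  # noisy partial gradient in y
    y = project(y + b * gy, -1.0, 1.0)  # stochastic ascent on the inner problem
    x = project(x - a * gx, -2.0, 2.0)  # projected stochastic descent in x
# x drifts toward the minimizer x* = 0 of F(x)
```

The step-size schedules satisfy the usual stochastic-approximation conditions (divergent sums, convergent sums of squares); the inner variable uses larger steps so that it tracks the current inner maximizer as the outer iterate moves.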