# Minimax principle

An optimality principle for a two-person zero-sum game, expressing the tendency of each player to obtain the largest sure pay-off. The minimax principle holds in such a game $\Gamma=\langle A,B,H\rangle$ if the equality
$$v=\max_{a\in A}\min_{b\in B}H(a,b)=\min_{b\in B}\max_{a\in A}H(a,b)\tag{*}$$
holds, that is, if the game has a value, equal to $v$, and both players have optimal strategies. The strategies $a^*$ and $b^*$ on which the outer extrema in (*) are attained form a saddle point:
$$H(a,b^*)\leq H(a^*,b^*)\leq H(a^*,b)$$
for all $a\in A$, $b\in B$. Thus, the minimax principle expresses mathematically the intuitive conception of stability, since it is not profitable for either player to deviate unilaterally from his optimal strategy $a^*$ (respectively, $b^*$). At the same time, the minimax principle guarantees player I (II) a gain (loss) of not less (not more) than the value of the game. An axiomatic characterization of the minimax principle for matrix games has been given.
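For a finite matrix game the two sides of (*) and the saddle-point inequality can be checked directly. The following sketch (the pay-off matrix is an illustrative assumption, not taken from the article) computes $\max_a \min_b H(a,b)$ and $\min_b \max_a H(a,b)$ and, when they coincide, verifies the saddle-point condition:

```python
# H[i][j] is player I's pay-off when I plays row i and II plays column j.
# Example matrix chosen (as an assumption) so that a pure saddle point exists.
H = [
    [4, 2, 3],
    [5, 1, 2],
    [3, 0, 1],
]

row_min = [min(row) for row in H]        # player I's guaranteed gain per row
col_max = [max(col) for col in zip(*H)]  # player II's guaranteed loss per column

v_lower = max(row_min)   # max_a min_b H(a, b)
v_upper = min(col_max)   # min_b max_a H(a, b)

if v_lower == v_upper:
    v = v_lower                          # the value of the game
    a_star = row_min.index(v_lower)      # optimal strategy of player I
    b_star = col_max.index(v_upper)      # optimal strategy of player II
    # Saddle-point check: H(a, b*) <= H(a*, b*) <= H(a*, b) for all a, b.
    assert all(H[a][b_star] <= H[a_star][b_star] for a in range(len(H)))
    assert all(H[a_star][b_star] <= H[a_star][b] for b in range(len(H[0])))
    print(v, a_star, b_star)  # → 2 0 1
```

In general the two extrema differ for pure strategies ($v_{\text{lower}} \leq v_{\text{upper}}$ always holds); equality in mixed strategies is guaranteed for matrix games by the von Neumann minimax theorem.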