On some extremal problems for mutual information and entropy



Abstract

The problem of determining the maximum mutual information I(X; Y) and the minimum entropy H(X, Y) of a pair of discrete random variables X and Y is considered under the condition that the probability distribution of X is fixed and the error probability Pr{Y ≠ X} takes a given value ε, 0 ≤ ε ≤ 1. Precise values of both quantities are found; in several cases this yields explicit formulas for the maximum mutual information and the minimum entropy as functions of the probability distribution of X and the parameter ε.
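The following is a minimal numerical sketch of the extremal problem the abstract describes, not the paper's derivation (the paper obtains exact closed-form answers). For binary X with a fixed pmf (p0, 1 − p0) and error probability ε = Pr{Y ≠ X}, the joint pmf of (X, Y) has a single free parameter t = Pr{X = 0, Y = 1}; sweeping t over its feasible range and evaluating I(X; Y) and H(X, Y) at each point approximates the maximum and minimum. The names sweep, p0, and eps are illustrative, not from the paper.

```python
# Illustrative brute-force sweep for the binary case; not the paper's method.
import numpy as np

def entropy(q):
    """Shannon entropy in bits, with the convention 0 * log(0) = 0."""
    q = q[q > 0]
    return -np.sum(q * np.log2(q))

def sweep(p0, eps, n=10_000):
    """Approximate (max I(X;Y), min H(X,Y)) over joint pmfs of binary
    (X, Y) with Pr{X=0} = p0 fixed and Pr{Y != X} = eps."""
    p1 = 1.0 - p0
    # Feasibility of t = Pr{X=0, Y=1}: all four joint cells must be >= 0.
    t_lo, t_hi = max(0.0, eps - p1), min(eps, p0)
    best_i, best_h = -np.inf, np.inf
    for t in np.linspace(t_lo, t_hi, n):
        joint = np.array([[p0 - t, t],
                          [eps - t, p1 - eps + t]])
        h_xy = entropy(joint.ravel())
        h_x = entropy(np.array([p0, p1]))
        h_y = entropy(joint.sum(axis=0))        # marginal pmf of Y
        best_i = max(best_i, h_x + h_y - h_xy)  # I(X;Y) = H(X)+H(Y)-H(X,Y)
        best_h = min(best_h, h_xy)
    return best_i, best_h

if __name__ == "__main__":
    for eps in (0.0, 0.1, 0.3):
        i_max, h_min = sweep(p0=0.4, eps=eps)
        print(f"eps={eps}: max I(X;Y) ~ {i_max:.4f} bits, "
              f"min H(X,Y) ~ {h_min:.4f} bits")
```

At ε = 0 the sweep degenerates to Y = X, so the maximum mutual information equals H(X) and the minimum joint entropy equals H(X), as expected.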

About the authors

V. V. Prelov

Kharkevich Institute for Information Transmission Problems

Author for correspondence.
Email: prelov@iitp.ru
Moscow, Russian Federation
