## Description of the problem

Suppose we have an alphabet $\mathcal{X}$, where for each letter $X' \in \mathcal{X}$ we have a corresponding distribution $P_{X'} \in \mathcal{P}$.

A letter $X \in \mathcal{X}$ is sent through a channel in which an adversary chooses $k$ random letters from $\mathcal{X}$ (excluding the true $X$) to form a set $U = \{X_1, \ldots, X_k\}$, and sends $N$ i.i.d. samples drawn from the mixture distribution $\bar{P}_U = \frac{1}{k}(P_{X_1} + \ldots + P_{X_k})$, denoted $Y_1, \ldots, Y_N$ (or collectively $Y^N$). What is the minimum probability of error that a decoder at the receiver can achieve?
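To make the setup concrete, here is a minimal simulation sketch of this channel. The alphabet, its size, and the particular distributions $P_x$ are made up for illustration (the question leaves them abstract); here each letter indexes a distribution over the same finite symbol set, peaked at that letter.

```python
import random

# Hypothetical setup, not from the question: a 5-letter alphabet where each
# letter x has a distribution P_x over the symbols {0, ..., 4}, peaked at x.
ALPHABET = list(range(5))

def P(x):
    """Distribution P_x: weight 0.6 on symbol x, 0.1 on each other symbol."""
    return [0.6 if s == x else 0.1 for s in ALPHABET]

def channel(x, k, n, rng):
    """The adversary picks k random letters other than the true x, then emits
    n i.i.d. samples from the uniform mixture of their distributions."""
    u = rng.sample([a for a in ALPHABET if a != x], k)
    mixture = [sum(P(a)[s] for a in u) / k for s in ALPHABET]
    samples = rng.choices(ALPHABET, weights=mixture, k=n)
    return u, samples

rng = random.Random(0)
u, y = channel(x=2, k=2, n=10, rng=rng)
assert 2 not in u and len(y) == 10  # the true letter never enters the mixture
```

Note that the true $X$ influences $Y^N$ only through its exclusion from $U$; this is the structural fact the decoder has to exploit.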

### Target solution

As $N \rightarrow \infty$, we would expect the probability of error to be bounded below by $\frac{1}{|\mathcal{X}| - k}$, but I am having a hard time proving that this is the case. Ideally, we would find a tight non-asymptotic bound.
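One heuristic for why a bound involving $\frac{1}{|\mathcal{X}| - k}$ should appear (a sketch, under the added assumptions that $X$ is uniform a priori and that a genie reveals $U$ to the receiver):

```latex
% Given U, the chain X -> U -> Y^N is Markov, so Y^N carries no information
% about X beyond U. The receiver then only knows X \notin U, and with a
% uniform prior the best guess is any letter of \mathcal{X} \setminus U:
P(\hat{X} = X \mid U) \le \frac{1}{|\mathcal{X}| - k}
\quad\Longrightarrow\quad
P(\hat{X} \neq X) \ge 1 - \frac{1}{|\mathcal{X}| - k}.
```

In particular the error cannot vanish, and for $|\mathcal{X}| - k \ge 2$ this genie bound is at least $\frac{1}{|\mathcal{X}| - k}$.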

### My attempt at proof

It seems like this should be a simple application of Fano's inequality. Given an estimator $\hat{X}$ of $X$, we have

$$ P(\hat{X} \neq X) \geq 1 - \frac{I(X; Y^N) + 1}{\log |\mathcal{X}|}, $$

where $I(X; Y^N)$ denotes the mutual information between $X$ and $Y^N$. I could not do better than using $I(X; Y^N) \leq N \, I(X; Y)$ and bounding the mutual information by the maximum pairwise KL divergence. Unfortunately, this lower bound becomes vacuous (it drops to zero and below) once $N$ grows large enough.
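To see concretely how the bound trivializes, here is a numeric sketch. The per-sample mutual information and the alphabet size are made-up values, not taken from the question:

```python
import math

# Hypothetical, purely illustrative values.
I_per_sample = 0.05    # an assumed per-sample I(X;Y), in nats
alphabet_size = 32     # an assumed |X|

def fano_lower_bound(n):
    """Fano lower bound on P(error) after substituting I(X;Y^N) <= n*I(X;Y)."""
    return 1 - (n * I_per_sample + 1) / math.log(alphabet_size)

for n in [1, 10, 50, 100]:
    print(n, round(fano_lower_bound(n), 3))
```

Since the substituted upper bound $N \, I(X;Y)$ grows linearly in $N$ while $\log|\mathcal{X}|$ is fixed, the right-hand side eventually goes negative, which is why this route cannot keep the error bounded away from zero.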