Min-entropy



In probability theory and information theory, the min-entropy of a discrete random variable X with possible outcomes 1, ..., n and corresponding probabilities p_1, ..., p_n is

H_\infty(X) = \min_{1 \le i \le n} \bigl(-\log p_i\bigr) = -\max_{1 \le i \le n} \log p_i = -\log \max_{1 \le i \le n} p_i
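
As a minimal illustration of the definition (the function names and the example distribution below are chosen here for illustration and are not part of any standard library), the following Python sketch computes the min-entropy of a finite distribution in bits and compares it with the Shannon entropy:

import math

def min_entropy(probs):
    # Min-entropy H_inf(X) = -log2(max_i p_i), in bits.
    if not math.isclose(sum(probs), 1.0, abs_tol=1e-9):
        raise ValueError("probabilities must sum to 1")
    return -math.log2(max(probs))

def shannon_entropy(probs):
    # Shannon entropy H(X) = -sum_i p_i log2 p_i, in bits, for comparison.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example: the distribution (1/2, 1/4, 1/4). The most likely outcome has
# probability 1/2, so H_inf = -log2(1/2) = 1 bit, while the Shannon entropy
# is 1.5 bits; the min-entropy is determined entirely by the largest probability.
p = [0.5, 0.25, 0.25]
print(min_entropy(p))      # 1.0
print(shannon_entropy(p))  # 1.5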