
Entropy

$Entropy(S)$ = expected number of bits needed to encode the class ($\oplus$ or $\ominus$) of a randomly drawn member of $S$ (under the optimal, shortest-length code)


Why?

Information theory: an optimal-length code assigns $- \log_{2}p$ bits to a message having probability $p$.
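
For example, a message with probability $p = 1/8$ (an illustrative value) is assigned a codeword of length

\begin{displaymath}-\log_{2} \frac{1}{8} = 3 \mbox{ bits} \end{displaymath}

while a message with probability $1/2$ needs only $1$ bit.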


So the expected number of bits to encode the class ($\oplus$ or $\ominus$) of a random member of $S$ is:

\begin{displaymath}p_{\oplus} (-\log_{2} p_{\oplus}) + p_{\ominus} (-\log_{2} p_{\ominus}) \end{displaymath}


\begin{displaymath}Entropy(S) \equiv - p_{\oplus} \log_{2} p_{\oplus} - p_{\ominus} \log_{2} p_{\ominus} \end{displaymath}
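
A minimal sketch of this computation in Python (the function name entropy and the sample counts below are illustrative assumptions, not from the notes); it uses the standard convention $0 \log_{2} 0 = 0$, so a perfectly pure sample has entropy $0$:

\begin{verbatim}
import math

def entropy(pos: int, neg: int) -> float:
    """Entropy (in bits) of a sample with pos positive and neg
    negative examples, per the definition above."""
    total = pos + neg
    h = 0.0
    for count in (pos, neg):
        if count > 0:          # skip empty classes: 0 * log2(0) := 0
            p = count / total
            h -= p * math.log2(p)
    return h

# Illustrative sample counts (not from the original notes):
print(entropy(9, 5))    # mixed sample: ~0.940 bits
print(entropy(7, 7))    # 50/50 split: 1.0 bit (the maximum)
print(entropy(14, 0))   # all one class: 0.0 bits
\end{verbatim}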


