cm:entropy_rate

Def: Let <latex> \mathcal{P} = \ldots X_{-1} X_0 X_1 \ldots </latex> be a discrete-time stationary process over a finite alphabet of symbols <latex> \mathcal{A} </latex>. The process's (per-symbol) entropy rate <latex> h_{\mu} </latex> is defined as the asymptotic limit of the ratio of the length-L block entropy <latex> H[L] \equiv H[X_1 X_2 \ldots X_L] </latex> to the block length L: <latex> h_{\mu} \equiv \lim_{L \to \infty} H[L]/L </latex>.
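
As a numerical illustration (a minimal sketch, not part of the original definition), the block entropies <latex> H[L] </latex> can be estimated from a long sample of a process and the ratio <latex> H[L]/L </latex> tracked as L grows. The example process here is the Golden Mean process (a 1 is never followed by a 1), chosen as an assumption because its entropy rate is known to be 2/3 bit per symbol; the function name block_entropy is illustrative.

<code python>
from collections import Counter
import math
import random

def block_entropy(seq, L):
    """Shannon entropy (in bits) of the length-L blocks observed in seq."""
    counts = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Sample the Golden Mean process: a 1 is never followed by another 1;
# otherwise 0 and 1 are emitted with equal probability.
random.seed(0)
seq, prev = [], 0
for _ in range(200_000):
    prev = 0 if prev == 1 else random.randint(0, 1)
    seq.append(prev)

for L in range(1, 9):
    H_L = block_entropy(seq, L)
    print(f"L={L}  H[L]={H_L:.4f}  H[L]/L={H_L / L:.4f}")
</code>

The ratio <latex> H[L]/L </latex> approaches 2/3 from above, but only slowly, which motivates the alternative formulation below.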

An alternative formulation is the following. Let <latex> h_{\mu}(L) \equiv H[\overrightarrow{X}^L] - H[\overrightarrow{X}^{L-1}] = H[\overrightarrow{X}^L|\overrightarrow{X}^{L-1}] = H[X_L|\overrightarrow{X}^{L-1}] </latex>, the uncertainty in the L-th symbol given the preceding L-1 symbols. By stationarity, <latex> h_{\mu}(L) </latex> is a monotonically non-increasing sequence (conditioning on more of the past cannot increase the uncertainty), and it can be shown that <latex> h_{\mu} = \lim_{L \to \infty} h_{\mu}(L) </latex>. Thus <latex> h_{\mu} </latex> can be thought of (somewhat informally) as the uncertainty in the next symbol given the infinite past: <latex> h_{\mu} = H[X_1| \overleftarrow{X}] </latex>.
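
Continuing the sketch above (and reusing its illustrative block_entropy function and sample seq), the finite-L estimates <latex> h_{\mu}(L) = H[L] - H[L-1] </latex> typically converge to <latex> h_{\mu} </latex> much faster than the raw ratio <latex> H[L]/L </latex>:

<code python>
prev_H = 0.0  # H[0] = 0: the empty block carries no uncertainty
for L in range(1, 9):
    H_L = block_entropy(seq, L)
    h_L = H_L - prev_H  # h_mu(L) = H[L] - H[L-1] = H[X_L | X^{L-1}]
    prev_H = H_L
    print(f"L={L}  h_mu(L)={h_L:.4f}")
</code>

For the Golden Mean process the estimates settle near 2/3 bit already at L = 2, since the process has only one symbol of memory.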

For a process generated by an epsilon-machine M with a finite set of states <latex> R = \{r_1, \ldots , r_N \} </latex>, one can show that <latex> h_{\mu} </latex> is the weighted average of the uncertainty in the next symbol emitted from each machine state, <latex> h_{\mu} = \sum_k \Pr(r_k)\, H[X_{next}|r_k] </latex>, where <latex> \Pr(r_k) </latex> is the stationary probability of state <latex> r_k </latex>.
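
A minimal sketch of this state-weighted average, assuming the machine is represented as a dictionary T[state][symbol] = (next_state, probability) (an illustrative representation, not a standard API). The machine below is the Golden Mean epsilon-machine, so the exact answer is 2/3 bit per symbol.

<code python>
import math
import numpy as np

# State A: emit 0 or 1 with probability 1/2 each; state B: emit 0 with probability 1.
T = {
    "A": {0: ("A", 0.5), 1: ("B", 0.5)},
    "B": {0: ("A", 1.0)},
}
states = sorted(T)

# State-to-state transition matrix, summed over emitted symbols.
M = np.zeros((len(states), len(states)))
for i, s in enumerate(states):
    for _, (t, p) in T[s].items():
        M[i, states.index(t)] += p

# Stationary distribution Pr(r_k): left eigenvector of M with eigenvalue 1.
vals, vecs = np.linalg.eig(M.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
pi /= pi.sum()

# h_mu = sum_k Pr(r_k) * H[X_next | r_k]
h_mu = 0.0
for i, s in enumerate(states):
    H_s = -sum(p * math.log2(p) for _, (t, p) in T[s].items() if p > 0)
    h_mu += pi[i] * H_s
print(f"h_mu = {h_mu:.4f} bits per symbol")  # 0.6667 for the Golden Mean process
</code>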

* Note: Perhaps we want to change the notation for epsilon-machine states. There was nothing in the notation section; this is what I have been using, but I don't think it's standard. *
