
The field of Information Theory gives us a mathematical understanding of what a random number generator really is. Our ideal random number generator is a zero-memory information source with maximum entropy. We think of a source as emitting a sequence of symbols from a fixed finite set S = {s1, s2, ..., sn}, with successive symbols selected according to some fixed probability law. A zero-memory source remembers nothing about what previous symbols it has emitted; each successive symbol is completely independent of all previous output symbols. Maximum entropy means that each symbol in the set S is equally likely to occur; any other probability distribution, with some symbols more likely than others, would result in a lower entropy. You may recall that entropy is also used in thermodynamics to measure the amount of "randomness" in a particular system.
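The relationship between the probability distribution and entropy can be sketched with a few lines of Python; the Shannon entropy formula H = -Σ p·log2(p) is standard, and the distributions below are made up purely for illustration:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 8 symbols achieves the maximum entropy, 3 bits.
uniform = [1/8] * 8
print(entropy(uniform))  # 3.0

# Any skewed distribution over the same 8 symbols has strictly lower entropy.
skewed = [0.5, 0.2, 0.1, 0.05, 0.05, 0.04, 0.03, 0.03]
print(entropy(skewed))
```

The maximum-entropy case is exactly our ideal generator: every symbol equally likely, so each emitted symbol carries the full log2(n) bits of information.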

A linear congruential random number generator is an example of a first-order Markov process: the next value depends entirely on the previous value. Even the Data Encryption Standard (DES), when used as a pseudorandom number generator, is nothing more than a first-order Markov process. A first-order Markov source can emulate a zero-memory source only up to a point.
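The first-order Markov character of a linear congruential generator is easy to see in code. This is a minimal sketch; the constants are the well-known ANSI C rand()-style parameters, chosen here only for illustration:

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    """Minimal linear congruential generator: x' = (a*x + c) mod m.
    The internal state is a single number, so the entire future of the
    sequence is determined by the current value alone."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(42)
print([next(gen) for _ in range(3)])
```

Because the state is just the last output, anyone who observes one value can predict every later one, which is exactly what "first-order Markov" means here.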

A higher-order Markov process has a better chance of emulating a zero-memory process. By shuffling the output, we convert the first-order Markov process into something like an Mth-order Markov process, where M is the size of the shuffling deck we use. In this case M pieces of information about previous outputs are kept instead of just one. Because of this additional memory, the output can more closely resemble that of a zero-memory source, and indeed it does a much better job.
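One common form of this shuffling is the Bays-Durham shuffle, sketched below under the assumption that the underlying source is the illustrative LCG from above (any integer-valued generator would do):

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    """Underlying first-order source (illustrative ANSI C-style constants)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

def shuffled(source, deck_size=32):
    """Bays-Durham-style shuffle: keep a deck of M previous outputs and
    let the last emitted value choose which deck slot to emit next.
    The effective state is now M+1 values instead of one."""
    deck = [next(source) for _ in range(deck_size)]
    last = next(source)
    while True:
        j = last % deck_size                    # previous output picks a slot
        last, deck[j] = deck[j], next(source)   # emit that slot, refill it
        yield last

gen = shuffled(lcg(1))
print([next(gen) for _ in range(5)])
```

The deck breaks up the strict x-determines-next-x dependence: predicting the next output now requires knowing all M deck entries, not just the last value.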

For a deeper understanding of zero-memory information sources and Markov information sources, consult a good book on Information Theory. The book by Abramson in the references below is an excellent one; it also gives an interesting account of how languages such as English, Spanish, French, and German can be synthesized using a low-order Markov process. (See The Structure of Language)
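The language-synthesis idea mentioned above can be sketched in a few lines: build a table of which characters follow each k-character context in some sample text, then generate new text by repeatedly sampling a successor. The sample string and parameters here are invented for illustration, not taken from Abramson's book:

```python
import random
from collections import defaultdict

def build_model(text, order=2):
    """For each k-character context, collect the characters that follow it."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        model[text[i:i + order]].append(text[i + order])
    return model

def synthesize(model, length=60, seed=0):
    """Emit characters by repeatedly sampling a successor of the last k chars."""
    rng = random.Random(seed)
    state = rng.choice(list(model))
    out = state
    for _ in range(length):
        followers = model.get(state)
        if not followers:
            break                       # dead end: context never continued
        out += rng.choice(followers)
        state = out[-len(state):]
    return out

sample = "the theory of information measures the entropy of the source "
print(synthesize(build_model(sample)))
```

Even at order 2 the output starts to pick up the letter-pair statistics of the sample; higher orders produce increasingly word-like fragments, which is the effect Abramson illustrates for natural languages.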
