# Hardware random number generator

In computing, a hardware random number generator is an apparatus that generates random numbers from a physical process. Such devices are typically based on microscopic phenomena such as thermal noise or the photoelectric effect or other quantum phenomena. These processes are, in theory, completely unpredictable, and the theory's assertions of unpredictability are subject to experimental test. A quantum-based hardware random number generator typically contains an amplifier to bring the output of the physical process into the macroscopic realm, and a transducer to convert the output into a digital signal.

Random numbers can also be generated from macroscopic phenomena, such as playing cards, dice, and roulette wheels. Their unpredictability can be justified by the theory of unstable dynamical systems and chaos theory. These theories suggest that even though macroscopic phenomena are deterministic in principle under Newtonian mechanics, real-world systems evolve in ways that cannot be predicted in practice, because doing so would require knowing the initial conditions to an accuracy that grows exponentially over time.

Although dice are mostly used for gambling, the Victorian scientist Francis Galton described a way to use dice to generate random numbers for scientific purposes in 1890. Even though some gamblers believe they can control their throws of dice enough to win at craps (a claim which remains a topic of debate), no one (outside the paranormal community) claims that someone not party to the throwing can predict the outcomes.

Hardware random number generators may be relatively slow, and they may produce a biased sequence (that is, some numbers are more common than others). Whether a hardware random number generator is suitable for a particular application depends on the application.

## Contrast with pseudo-random number generators

Most computer "random number generators" are not hardware devices but software algorithms. These are more properly called pseudo-random number generators, and cannot produce truly random outputs. Algorithmic information theory defines a sequence of bits as random if and only if no computer program shorter than the sequence itself can produce it (Chaitin-Kolmogorov randomness). Pseudo-random number generators fail that test dramatically: they can usually be programmed in a few thousand bits, yet can produce sequences far longer.

There are several informal definitions of randomness, usually based either on a lack of discernible patterns or on unpredictability. Output from well-designed pseudo-random number generators may pass assorted statistical tests probing for non-randomness (see NIST Special Publication 800-22, Knuth, The Art of Computer Programming, vol. 2, and RFC 1750 for details of such tests). The sequences they produce always have a pattern in one sense, since the algorithm that generates them has a finite state and must eventually repeat one of those states. Given the original state of the generator and its algorithm, a pseudo-random number generator is totally predictable, and given even partial knowledge of that state, it may be insecure for many purposes. On the other hand, it is easy to produce pseudo-random number generators that are guaranteed not to repeat on any conceivable computer within a time-frame millions of times longer than the age of the Universe. It is an open question whether it is possible in any practical way to distinguish the output of a well-designed pseudo-random number generator from a perfectly random source without knowledge of the generator's internal state.

## Uses of "random" numbers

Main article: Applications of randomness

"Unpredictable" random numbers were first investigated in the context of gambling, and many randomizing devices, such as dice, shuffled playing cards, and roulette wheels, were first developed for such use. Fairly produced random numbers are vital to electronic gambling, and ways of creating them are sometimes regulated by governmental gaming commissions.

"Random" numbers are also used for non-gambling purposes, both where their use is mathematically important, such as sampling for opinion polls, and in situations where "fairness" is approximated by randomization, such as selecting jurors and military draft lotteries. In the Book of Numbers (33:54), Moses commands the Israelites to apportion the land by lot (גורל).

## Early attempts to generate true random numbers

One early way of producing random numbers was by a variation of the same machines used to play keno or select lottery numbers: numbered ping-pong balls were mixed with blown air, and some mechanism withdrew balls from the mixing chamber. The method gives reasonable results, but it is expensive and inherently slow, and is unusable in most mechanized situations.

In 1927, Cambridge University Press published a book of random sampling numbers, arranged by the statistician Leonard Henry Caleb Tippett, which contained 41,600 digits taken from the areas of English parishes listed in census records. Other random number tables were published, including one by R. A. Fisher and another by the U.S. Interstate Commerce Commission in 1949 with over 100,000 random digits.

The culminating printed work was A Million Random Digits with 100,000 Normal Deviates, published by the RAND Corporation in 1955. They created an electronic simulation of a roulette wheel attached to a key punch, the results of which were carefully filtered and tested before being used to generate the table. The RAND table was a significant breakthrough in delivering random numbers because such a large and carefully prepared table had never before been available, and it was useful for simulations and modeling. But, having been published, its values are not usable as cryptographic keys or as seeds for cryptographic pseudo-random number generators. However, precisely because it has already been published, picking its values as the random constants for initializing algorithms can demonstrate that the constants were not selected for (in Bruce Schneier's words) "nefarious purposes." The ciphers Khufu and Khafre do this, for example (see Applied Cryptography, B. Schneier). See: Nothing up my sleeve numbers.

## Physical phenomena used for random number generation

Many modern random number generators attempt to use some form of quantum-mechanical noise, widely considered to be the gold standard for randomness. (For a discussion of empirical verification of quantum unpredictability, see Bell test experiments). Some phenomena used include:

• a nuclear decay radiation source (for instance, from some kinds of commercial smoke alarms), detected by a Geiger counter attached to a PC
• atmospheric noise, detected by a radio receiver attached to a PC
• thermal or quantum-mechanical noise, amplified to provide a random voltage source; a favored source is the avalanche noise generated by a reverse-biased Zener diode, though the thermal noise from a resistor can also be used

One approach is to convert the noise source into random bits in a separate device that is then connected to the computer through an I/O port. The acquired noise signal is amplified, filtered, and then run through a high-speed voltage comparator to produce a logic signal that alternates states at random intervals. Care must be taken when amplifying low-level noise to keep out spurious signals, such as power line hum and broadcast transmissions. In some simple designs, this logic value is converted to an RS-232 level signal and sent directly to a computer's serial port. Software then sees this series of logic values as bursts of "line noise" characters on an I/O port. More sophisticated systems may format the bit values before passing them into a computer.

Another approach is to feed an analog noise signal to an analog to digital converter, such as the existing audio input port available on most personal computers. The digitized signal may then be processed further in software to remove any bias.
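This digitize-then-debias pipeline can be sketched in Python. The noise source below is simulated with Gaussian samples, since real audio capture depends on platform-specific APIs; the function names are illustrative only:

```python
import random

def sample_noise(n):
    """Stand-in for an audio-input capture: simulated Gaussian noise.
    A real implementation would read samples from the sound card."""
    return [random.gauss(0.0, 1.0) for _ in range(n)]

def digitize(samples, threshold=0.0):
    """Compare each sample against a threshold to get raw bits; any
    offset in the threshold shows up as bias to be removed later."""
    return [1 if s > threshold else 0 for s in samples]

raw_bits = digitize(sample_noise(1000))
```

The raw bits would then be passed to a software whitening step, as discussed under "Attempts to clean up non-random bit-streams."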

Some have suggested using digital cameras, such as webcams, to photograph chaotic macroscopic phenomena. A group at Silicon Graphics imaged Lava lamps to generate random numbers. One problem was determining whether the chaotic shapes generated were random; the team decided that they are in properly operating Lava lamps. Other chaotic scenes could be employed, such as streamers blown by a computer's exhaust fan or bubbles in a fish tank (fish optional). The digitized image will generally contain additional noise resulting from the video-to-digital conversion process.

One commercial product, Quantis from id Quantique SA, exploits an elementary quantum-optics process, sending photons one by one onto a semi-transparent mirror. The two mutually exclusive outcomes (reflection and transmission) are detected and associated with "0" and "1" bit values respectively.

Perhaps the most common approach is to use precise timing of the interrupts caused by mechanical input/output devices, such as keyboards and disk drives, as a source of randomness. Done carefully (as in, for example, the Yarrow algorithm), enough entropy can be collected for the occasional creation of cryptographic keys and nonces.

## Dealing with bias

The bit-stream from such systems is prone to be biased, with 1s or 0s predominating. There are two approaches to dealing with bias and other artifacts. The first is to engineer the RNG to minimize bias. One method to correct this feeds back the generated bit stream, filtered by a low-pass filter, to adjust the bias of the generator. By the central limit theorem, the feedback loop will tend to be well-adjusted almost all the time. Ultra-high speed random number generators use this method. Even then, the numbers generated are usually somewhat biased.

A higher quality device might use two sources and eliminate signals that are common to both—this eliminates interference from outside electric and magnetic fields. This is recommended for gambling devices, to prevent cheating by exploiting bias in the "random bit" stream.

The Intel RNG (supplied in some of their board-level chipsets for PC-type computers, and so perhaps the most widely available) uses most of the above tricks and adds another: the non-common-mode noise from two diodes controls a voltage-controlled oscillator, which clocks bits from a rapid oscillator (the new trick), which then go through a von Neumann-type decorrelation step.

A related method uses two uncoupled oscillators, counting events in one from the time-base of the other. This method has been used on PCs to generate software/hardware "true-random" numbers. It requires a PC with two clock crystals, one for the real-time clock and another for the processor. The program loops, counting the time during which one of the bits of the real-time clock's counter is a 1. The least significant bit of the loop counter can be quite 'random'; depending on implementation details (and on the current condition and operation of the hardware), it can even be random "enough" for some uses. A variation of this method is implemented in hardware in some versions of the VIA C3 microprocessor, and software can access the random bits using special machine-language instructions.

## Attempts to clean up non-random bit-streams

A second approach to bias is to remove it in software. Even if the above hardware bias reduction steps have been taken, the bit-stream should still be assumed to contain bias and correlation. There are various methods used to try to reduce bias and correlation, often known by the name "whitening" algorithms, by analogy with the related problem of producing white noise from a correlated signal.

John von Neumann invented a simple algorithm to fix simple bias and reduce correlation: it considers bits two at a time, taking one of three actions. When two successive bits are the same, they are discarded; a sequence of 1,0 becomes a 1; and a sequence of 0,1 becomes a 0. This eliminates simple bias and is easy to implement as a computer program or in digital logic. The technique works no matter how the bits have been generated, although it cannot assure randomness in its output. What it can do is (with significant loss) transform a random stream whose frequency of 1s differs from 50% into a stream with a 50% frequency, which is useful with some physical sources. When the stream has a 50% frequency of 1s to begin with, the technique reduces the available bit rate by a factor of four, on average.
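The procedure is short enough to state directly; a Python sketch:

```python
def von_neumann(bits):
    """Von Neumann debiasing: look at non-overlapping pairs of bits.
    Equal pairs (00, 11) are discarded; 1,0 yields 1 and 0,1 yields 0."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)  # the first bit of the pair encodes the result
    return out
```

For example, the input `[1, 0, 0, 1, 1, 1, 0, 0]` splits into the pairs (1,0), (0,1), (1,1), (0,0) and yields `[1, 0]`: two output bits from eight input bits, illustrating the factor-of-four rate loss.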

Another method for improving a near-random bit stream is to exclusive-or the bit stream with the output of a high-quality cryptographically secure pseudo-random number generator, such as Blum Blum Shub or a good stream cipher. This can cheaply reduce both correlation and digit bias.

A very simple method to improve a near-random bit stream is to take two or more uncorrelated near-random bit streams and exclusive-or them together. Let the probability of a bit stream producing a 1 be 1/2 + e, where −1/2 < e < 1/2; e is then the bias of the bit stream. If two uncorrelated bit streams with bias e are exclusive-or-ed together, the bias of the result will be 2e² in magnitude. This may be repeated with more bit streams; if e is small, a very small bias is rapidly achieved. (See also Piling-up lemma.)
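A quick simulation (Python, with a fixed seed so the run is reproducible) illustrates the bias reduction. With e = 0.1, each input stream produces 1s 60% of the time, but the XOR of the two should produce 1s with frequency near 1/2 − 2e² = 0.48:

```python
import random

def xor_streams(a, b):
    """Combine two bit streams by element-wise exclusive-or."""
    return [x ^ y for x, y in zip(a, b)]

rng = random.Random(12345)  # fixed seed for reproducibility
n, e = 100_000, 0.1
s1 = [1 if rng.random() < 0.5 + e else 0 for _ in range(n)]
s2 = [1 if rng.random() < 0.5 + e else 0 for _ in range(n)]
freq = sum(xor_streams(s1, s2)) / n
# freq should land close to 1/2 - 2*e*e = 0.48
```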

Some designs apply cryptographic hash functions such as MD5, SHA-1, or RIPEMD-160 or even a CRC function to all or part of the bit stream, and then use the output as the random bit stream. This is attractive, partly because it is relatively fast compared to some other methods, but depends entirely on qualities in the hash output for which there is little or no theoretical basis. In addition, recent (2004) research results have demonstrated reduced effort attacks against many of the most widely used hash algorithms.
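A sketch of this hash-based design, using SHA-256 in place of the older hashes named above (the block size and function names are illustrative):

```python
import hashlib

def hash_whiten(raw_bits, block_bits=512):
    """Whitening sketch: hash fixed-size blocks of the raw bit stream
    and concatenate the digests as the output stream."""
    out = bytearray()
    for i in range(0, len(raw_bits) - block_bits + 1, block_bits):
        block = raw_bits[i:i + block_bits]
        # pack the block of bits into bytes before hashing
        raw = int("".join(map(str, block)), 2).to_bytes(block_bits // 8, "big")
        out += hashlib.sha256(raw).digest()
    return bytes(out)

whitened = hash_whiten([0, 1] * 512)  # 1024 raw bits -> two hashed blocks
```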

Other designs use "true random bits" as the key for a high quality block cipher algorithm, taking the encrypted output as the random bit stream. Care must be used in these cases to select an appropriate block mode, however.

When a high-speed source of lower-quality random digits is needed, the true random source is sometimes used to generate the seed for a pseudo-random number generator. The PRNG is then run for a fixed number of digits while the hardware device produces a new seed.
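A sketch of that arrangement in Python, with the standard `secrets` module standing in for the hardware source (the class name and reseed interval are illustrative):

```python
import random
import secrets  # stands in for the true-random hardware source

class ReseedingPRNG:
    """Run a PRNG for a fixed number of outputs, then replace its seed
    with fresh bits from the (assumed) hardware source."""
    def __init__(self, outputs_per_seed=1000):
        self.limit = outputs_per_seed
        self.count = 0
        self.rng = random.Random(secrets.randbits(128))

    def next_byte(self):
        if self.count >= self.limit:
            self.rng.seed(secrets.randbits(128))  # fresh "hardware" seed
            self.count = 0
        self.count += 1
        return self.rng.getrandbits(8)
```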

## Estimating the size of the entropy pool

There are mathematical techniques for estimating the entropy of a sequence of symbols. These are useful for determining if there is enough entropy in a seed pool, for example, but they cannot, in general, distinguish between a true random source and a pseudo-random generator.
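The simplest such technique, an empirical Shannon-entropy estimate over observed symbol frequencies, can be sketched as:

```python
from collections import Counter
from math import log2

def empirical_entropy(symbols):
    """Estimate Shannon entropy (bits per symbol) from observed
    frequencies. Note the caveat above: a good pseudo-random generator
    scores just as well as a true random source on this measure."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * log2(c / n) for c in counts.values())
```

A perfectly balanced binary sequence scores 1 bit per symbol; a constant sequence scores 0.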

## Software implementation of random number generators from observed events

Software engineers without true random number generators often try to develop them by measuring physical events available to the software. One example is measuring the time between user keystrokes and then taking the least significant bit (or two or three) of the count as a random digit. A similar approach has been applied by measuring task scheduling, network hits, disk-head seek times, and other internal events.
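A minimal sketch of the timing approach, with loop iterations standing in for real keystrokes or interrupts (a real system would sample the counter only when an external event fires):

```python
import time

def timing_bits(n_events, lsbs=2):
    """Take the low-order bits of a high-resolution counter at each
    event. The events here are simulated by loop iterations."""
    bits = []
    for _ in range(n_events):
        t = time.perf_counter_ns()
        for k in range(lsbs):
            bits.append((t >> k) & 1)
    return bits
```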

The method is quite risky when it uses computer-controlled events because a clever, malicious programmer might be able to predict a cryptographic key by controlling the external events. Several gambling frauds have been uncovered which rely on manipulating (normally hidden) events internal to the operation of computers or networks. It is also risky because the supposed user-generated event (e.g., keystrokes) can be spoofed, allowing an attacker to control the "random values" used by the cryptography.

However, with sufficient care, a system can be designed that produces cryptographically secure random numbers from the sources of randomness available in a modern computer. The basic design is to maintain an "entropy pool" of random bits that are assumed to be unknown to an attacker. New randomness is added whenever available (for example, when the user hits a key) and an estimate of the number of bits in the pool that cannot be known to an attacker is kept. At this point there are several strategies:

• When random bits are requested, return that many bits derived from the entropy pool (by a cryptographic hash function, say) and decrement the estimate of the number of unknown bits. If not enough unknown bits are available, wait until enough are available. This is the design of the "/dev/random" device in Linux, and it provides high-quality random numbers as long as the estimates of the input randomness are sufficiently cautious. The Linux "/dev/urandom" device is a simple modification that returns pseudo-random numbers if there is not enough entropy.
• Maintain a stream cipher with a key and IV obtained from an entropy pool. When enough bits of entropy have been collected, replace both key and IV with new random values and decrease the estimated entropy in the pool. This is the approach taken by the `yarrow` library. It provides resistance against certain attacks and conserves the hard-to-obtain entropy.
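The pooled design in the first strategy can be sketched as follows; the class and method names are illustrative, not a real kernel interface:

```python
import hashlib

class EntropyPool:
    """Minimal sketch of an entropy pool with a conservative counter of
    bits assumed unknown to an attacker."""
    def __init__(self):
        self._pool = hashlib.sha256()
        self.estimated_bits = 0

    def add_event(self, data: bytes, entropy_bits: int):
        """Mix an observed event into the pool and credit its entropy."""
        self._pool.update(data)
        self.estimated_bits += entropy_bits

    def get_bytes(self, n: int) -> bytes:
        """Derive output through the hash (never expose the raw pool),
        refusing if the entropy estimate is too low."""
        if self.estimated_bits < n * 8:
            raise BlockingIOError("not enough entropy; caller must wait")
        self.estimated_bits -= n * 8
        out, counter = b"", 0
        while len(out) < n:
            out += hashlib.sha256(self._pool.digest() + bytes([counter])).digest()
            counter += 1
        return out[:n]
```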

There is a Perl module called Math::TrulyRandom that generates random numbers from interrupt timing discrepancies.

## Problems

It is very easy to misconstruct devices that generate random numbers. Also, they break silently, often producing decreasingly random numbers as they degrade. An example might be the decreasing radioactivity of the smoke alarms mentioned earlier: as the radioactive intensity decreases, the sensor must compensate, which is not an easily accomplished task. Failure modes in such devices are plentiful, and are neither easy nor quick nor cheap to detect.

Because they are quite fragile, and fail silently, statistical tests on their output should be performed continuously. Many such devices include some such tests into the software that reads the device. Others, of course, don't.

Just as with other components of a cryptosystem, a software random number generator should be designed to resist certain attacks. See: random number generator attack. Defending against these attacks is difficult.

## Checking the performance of hardware random number generators

Hardware random number generators should be constantly monitored for proper operation. RFC 1750 and FIPS Pub 140-2 include such tests. Also see the documentation for the New Zealand cryptographic software library cryptlib. Statistical tests can detect failure of a noise source, such as a radio station transmitting on a channel thought to be empty. Ideally, data should be sampled for testing before it is passed through a "whitener," since this allows better verification of the underlying noise generation; detecting some deviation from perfection would be a sign that a true noise source is being tested. Correlation of bias with other parameters, such as internal temperature or bus voltage, would be a further check.
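A minimal example of such a continuous statistical check, in the spirit of the FIPS 140-2 monobit test (the threshold here is a simple standard-deviation bound, not the exact FIPS acceptance interval):

```python
from math import sqrt

def monobit_ok(bits, sigma=3.0):
    """Sanity check: for an unbiased source, the count of 1s in a sample
    of n bits should stay within a few standard deviations of n/2
    (the standard deviation of the count is sqrt(n)/2)."""
    n = len(bits)
    ones = sum(bits)
    return abs(ones - n / 2) <= sigma * sqrt(n) / 2
```

A monitor would run this (and richer tests, such as runs and poker tests) on each block of raw output and raise an alarm on repeated failures.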