- For other meanings, see Random (disambiguation).
The word random is used to express apparent lack of purpose, cause, or order. The term randomness is often used synonymously with a number of measurable statistical properties, such as lack of bias or correlation. Quite differently, however, random is also used to describe a style of humor in popular culture and the media, most notably in internet cartoons and flash animations.
Randomness has an important place in science and philosophy.
Mankind has been concerned with randomness since prehistoric times, mostly through divination (reading messages in random patterns) and gambling. The opposition between free will and determinism has been a divisive issue in philosophy and theology.
Despite the prevalence of gambling in all times and cultures, for a long time there was little western inquiry into the subject, possibly due to the Church's disapproval of gambling and divination. Though Gerolamo Cardano and Galileo wrote about games of chance, it was work by Blaise Pascal, Pierre de Fermat and Christiaan Huygens that led to what is today known as probability theory.
Mathematicians focused at first on statistical randomness and considered block frequencies (that is, not only the frequencies of occurrences of individual elements, but also those of blocks of arbitrary length) as the measure of randomness, an approach that extended into the use of information entropy in information theory.
In the early 1960s Gregory Chaitin, Andrey Kolmogorov and Ray Solomonoff introduced the notion of algorithmic randomness, in which a sequence is random to the extent that it cannot be compressed.
Randomness versus unpredictability
Randomness should not be confused with practical unpredictability, a related idea in ordinary usage. Some mathematical systems, for example, could be seen as random even though they are fully deterministic; they are merely unpredictable, due to sensitive dependence on initial conditions (see chaos theory). Many random phenomena may exhibit organized features at some levels. For example, while the average rate of increase in the human population is quite predictable, the actual timing of individual births and deaths cannot be predicted in the short term. This small-scale randomness is found in almost all real-world systems. Ohm's law and the kinetic theory of gases are statistically reliable descriptions of the 'sum' (i.e., the net result or integration) of vast numbers of individual micro-events, each of which is random and none of which is individually predictable. (In principle, the micro-events of a gas could be predicted if the exact position, velocity, atomic composition, angular momentum, and so on of each particle were known.) All we directly perceive is circuit noise and some bulk gas behaviors.
It is important to note that chaotic systems are unpredictable in practice only because of their extreme dependence on initial conditions. Whether they are also unpredictable in the sense of computability theory, i.e., whether the result can be predicted given the initial conditions exactly, appears to be a subject of current research. In some branches of computability theory, the notion of randomness turns out to be identified with computational unpredictability.
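The sensitive dependence on initial conditions described above can be illustrated with the logistic map, a standard example from chaos theory. The following sketch (function name is illustrative) shows two deterministic trajectories with nearly identical starting points diverging completely:

```python
def logistic_orbit(x0, steps, r=4.0):
    """Iterate the logistic map x -> r*x*(1-x), a standard chaotic system."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.2, 50)            # deterministic: the same x0 always gives the same orbit
b = logistic_orbit(0.2 + 1e-10, 50)    # a nearly identical starting point
# the two trajectories diverge completely within a few dozen steps
```

The map is fully deterministic, yet predicting it far ahead in practice would require knowing the initial condition to an impossible precision.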
Unpredictability is required in some applications, such as the many uses of random numbers in cryptography. In other applications (e.g. modeling or simulation) statistical randomness is essential, but predictability is also required (for instance, when repeatedly running simulations or acceptance tests, it can be useful to be able to rerun the model with the exact same random input several times).
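The reproducibility requirement above is typically met by seeding a pseudo-random number generator. A minimal sketch in Python (the function name is illustrative):

```python
import random

def simulate(seed):
    """A toy simulation driven by a seeded, self-contained generator."""
    rng = random.Random(seed)                    # independent of the global RNG state
    return [rng.randint(1, 6) for _ in range(5)]

# the same seed replays exactly the same "random" input
assert simulate(42) == simulate(42)
```

Running the model with a fixed seed makes the statistically random input exactly repeatable, which is what repeated simulations and acceptance tests need.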
Sensibly dealing with randomness is a hard problem in modern science, mathematics, psychology and philosophy. Merely defining it adequately for the purposes of one discipline has proven quite difficult. Distinguishing between apparent randomness and actual randomness has been no easier. In addition, assuring unpredictability, especially against a well-motivated party (in cryptographic parlance, the "adversary"), has been harder still.
Some philosophers have argued that there is no randomness in the universe, only unpredictability. Others find the distinction meaningless. (See determinism).
Popular perceptions of randomness are frequently wrong, based on logical fallacies. Following is an attempt to identify the source of such fallacies and correct the logical errors. For a more detailed discussion, see Gambler's Fallacy.
A number is "due"
This argument says that "since all numbers will eventually come up in a random selection, those that have not come up yet are 'due' and thus more likely to come up soon". This logic is only correct if applied to a system where numbers that come up are removed from the system, like if cards are drawn and then removed from the deck. It's true that once a jack is removed from the deck, the next draw is less likely to be a jack and more likely to be some other card. However, if the card drawn is returned to the deck, and the deck is reshuffled, there should be an equal chance of drawing a jack or any other card the next time. The same truth applies to any other case where nothing is removed from the system after each event.
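The difference between drawing with and without replacement can be checked by simulation. A sketch in Python (function name is illustrative), estimating the chance that the card drawn after a jack is again a jack:

```python
import random

def p_jack_after_jack(replace, trials=200_000, seed=0):
    """Estimate P(second card is a jack | first card was a jack)."""
    rng = random.Random(seed)
    deck = ['J'] * 4 + ['x'] * 48              # 4 jacks in a 52-card deck
    firsts = second_jacks = 0
    for _ in range(trials):
        d = list(deck)
        rng.shuffle(d)
        if d.pop(0) != 'J':                    # condition on the first draw being a jack
            continue
        firsts += 1
        if replace:
            d.append('J')                      # return the jack and reshuffle
            rng.shuffle(d)
        if d[0] == 'J':
            second_jacks += 1
    return second_jacks / firsts

# without replacement the estimate is near 3/51; with replacement, near 4/52
```

Only removal from the system changes the odds; with replacement and reshuffling, the earlier draw carries no information about the next one.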
A number is "cursed"
This argument is almost the reverse of the above, and says that numbers which have come up less often in the past will continue to come up less often in the future. A similar "number is 'blessed'" argument might be made saying that numbers which have come up more often in the past are likely to do so in the future. This logic is only valid if the roll is somehow biased and results don't have equal probabilities - for example, with a weighted die. If we know for certain that the roll is fair, then previous events have no influence over future events.
Note that in nature, unexpected or uncertain events rarely occur with perfectly equal frequencies, so learning which events are likely to have higher probability by observing outcomes makes sense. What is fallacious is to apply this logic to systems which are specially designed so that all outcomes are equally likely - such as dice, roulette wheels, and so on.
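Learning probabilities from observed outcomes, as described above, can be sketched in Python (names are illustrative):

```python
import random
from collections import Counter

def estimate_probs(weights, rolls=60_000, seed=1):
    """Roll a (possibly weighted) six-sided die and estimate each face's probability."""
    rng = random.Random(seed)
    faces = [1, 2, 3, 4, 5, 6]
    counts = Counter(rng.choices(faces, weights=weights, k=rolls))
    return {f: counts[f] / rolls for f in faces}

fair = estimate_probs([1, 1, 1, 1, 1, 1])      # every estimate comes out near 1/6
loaded = estimate_probs([1, 1, 1, 1, 1, 3])    # face 6 appears about 3/8 of the time
```

Observed frequencies reveal the bias of a weighted die; for a die designed to be fair, they reveal only that all faces are equally likely, and nothing about the next roll.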
Study of randomness
Many scientific fields are concerned with randomness:
- Algorithmic probability
- Chaos theory
- Game theory
- Information theory
- Pattern recognition
- Probability theory
- Quantum mechanics
- Statistical mechanics
Note that the bias that "everything has a purpose or cause" is actually implicit in the expression "apparent lack of purpose or cause". Humans constantly look for patterns in their experience, and the most basic pattern seems to be cause and effect. This tendency appears to be deeply embedded in the human brain, and perhaps in other animals as well. Dogs and cats, for example, have often been reported to make cause-and-effect connections that strike us as amusing or peculiar (see classical conditioning). One report describes a dog who, after a visit to a vet whose clinic had tile floors of a particular kind, thereafter refused to go near such a tiled floor, whether or not it was at a vet's.
It is because of this bias that the absence of a cause seems problematic. See causality.
To solve this 'problem', random events are sometimes said to be caused by chance. Rather than solving the problem of randomness, this opens the gaping hole of defining chance. It is hard to avoid circularity by defining chance in terms of randomness.
The characteristics of an organism are traditionally said to be due to genetics and environment, but there are also random elements. Consider, for example, freckles on a person's skin. Genetic inheritance controls the potential for developing freckles (a trait linked to the gene for red hair), while environment, such as solar exposure, determines how many of those potential freckles actually appear. The location of each individual freckle, however, can be predicted neither from genetics nor from solar exposure, and so is ascribed to a random element. Whether this is truly random, or merely follows a pattern too complex for us to understand, is not known.
Note that this effect isn't limited to physical characteristics. Sexual orientation, for example, also appears to have a random element. In identical-twin studies, such twins are more likely to share a sexual orientation than two randomly chosen individuals from the same population. If the twins are adopted and raised in separate environments, this correlation can be attributed solely to genetics; if they are raised in the same environment, it could reflect either genetics or environment. However, even identical twins raised in the same environment do not show a 100% correlation in sexual orientation. In cases where the two differ, the difference must be attributed to a random element. Again, we do not know whether this is truly random or merely follows a pattern too complex for us to understand.
In the natural sciences
Traditionally, randomness takes on an operational meaning in natural science: something is apparently random if its cause cannot be determined or controlled. When an experiment is performed and all the control variables are fixed, the remaining variation is ascribed to uncontrolled (i.e., 'random') influences. The assumption, again, is that if it were somehow possible to control all influences perfectly, the result of the experiment would always be the same. Therefore, for most of the history of science, randomness has been interpreted in one way or another as ignorance on the part of the observer.
With the advent of quantum mechanics, however, it appears that the world might be irreducibly random. According to the standard interpretations of the theory, it is possible to set up an experiment with total control of all relevant parameters which will still have a perfectly random outcome. Minority resistance to this idea takes the form of hidden-variable theories, in which the outcome of the experiment is determined by certain unobservable characteristics (hence the name "hidden variables"). The debate is over whether truly random events exist, or whether events perceived as random simply follow patterns too complex for our cognitive abilities.
Many physical processes resulting from quantum-mechanical effects are, therefore, believed to be irreducibly random. The best-known example is the timing of radioactive decay events in radioactive substances.
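Decay timing is conventionally modeled with exponentially distributed waiting times. The sketch below (names are illustrative) merely samples from that standard model rather than simulating real decay:

```python
import math
import random

def decay_times(half_life, n, seed=0):
    """Sample n waiting times between decays: exponential with rate ln(2)/half_life."""
    rng = random.Random(seed)
    rate = math.log(2) / half_life
    return [rng.expovariate(rate) for _ in range(n)]

times = decay_times(half_life=10.0, n=100_000)
# the sample mean approaches half_life / ln(2), yet each individual time is unpredictable
```

The bulk statistics (the mean, the half-life) are perfectly predictable even though the timing of any single decay is not, mirroring the micro/macro distinction discussed earlier.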
Deviations from randomness are often regarded by parapsychologists as evidence for the theories of parapsychology.
Random humor has become a popular genre in internet animations and cartoons. It relies not only on humorous situations but also on random words and specific trivia from the author's own life, used to deliver an out-of-the-blue joke or statement. An example might be, “Some guy walked down the street and slipped on a regurgitating dull blue hat.” While it may seem simple, random humor draws on several literary devices, such as strong personification of otherwise inanimate objects, unusual or uncommon adjectives, and relationships between ideas and concepts that would normally be overlooked. The genre has grown in internet flash cartoons and animations partly because it is so portable: anyone can create their own flavour of random humor, since its nature is to be random and to draw on that person's own experiences and language. Random humor also characteristically mixes intelligent and casual vocabulary, combining supposedly smart and ordinary words into a diverse and interesting register. It commonly relies on generic figures, such as "Some Guy" rather than a specific person, but often borrows concepts directly from parody and political satire.
Source of randomness
- Randomness coming from the environment (for example, Brownian motion, but also hardware random number generators)
- Randomness coming from the initial conditions. This aspect is studied by chaos theory, and is observed in systems whose behaviour is very sensitive to small variations in initial conditions (such as pachinko machines, dice ...).
- Randomness intrinsically generated by the system. This is also called pseudorandomness, and is the kind used in pseudo-random number generators. There are many algorithms (based on arithmetic or cellular automata) for generating pseudorandom numbers. The behaviour of the system can be determined by knowing the seed state and the algorithm used. This method is quicker than obtaining "true" randomness from the environment.
In practice, these sources of randomness often act together.
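The third source, pseudorandomness, can be illustrated with a minimal linear congruential generator (the constants below are the widely used Numerical Recipes parameters). The entire output stream is determined by the seed:

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Minimal linear congruential generator: the output is a function of the seed alone."""
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m                         # scale to [0, 1)

g1, g2 = lcg(1234), lcg(1234)
stream = [next(g1) for _ in range(5)]
assert stream == [next(g2) for _ in range(5)]   # same seed, identical "random" stream
```

The sequence looks statistically random, but anyone who knows the seed and the algorithm can reproduce it exactly, which is why such generators are unsuitable where true unpredictability is required.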
The mathematical theory of probability arose from attempts to formulate mathematical descriptions of chance events, originally in the context of gambling but soon in connection with situations of interest in physics. Statistics is used to infer the underlying probability distribution of a collection of empirical observations. For the purposes of simulation it is necessary to have a large supply of random numbers, or means to generate them on demand.
Algorithmic information theory studies, among other topics, what constitutes a random sequence. The central idea is that a string of bits is random if and only if it is shorter than any computer program that can produce that string (Chaitin-Kolmogorov randomness) - this basically means that random strings are those that cannot be compressed. Pioneers of this field include Andrey Kolmogorov, Ray Solomonoff, Gregory Chaitin, Per Martin-Löf, and others.
In communication theory
Successful communication in the real world depends, at the limit, on understanding and successfully minimizing the deleterious effects of assorted interference sources, many of which are apparently random. Such noise imposes performance limits on any communications channel and it was the study of those limits which led Shannon to develop information theory, make fundamental contributions to communication theory, and establish a theoretical grounding for cryptography.
- Hardware random number generator
- Information entropy
- Probability theory
- Pseudorandom number generator
- Random number
- Random sequence
- Random variable
- Stochastic process
Applications and use of randomness
- Main article: Applications of randomness
"Unpredictable" random numbers were first investigated in the context of gambling, and many randomizing devices, such as dice, shuffled playing cards, and roulette wheels, were first developed for such use. Fairly produced random numbers are vital to electronic gambling, and ways of creating them are sometimes regulated by governmental gaming commissions.
"Random" numbers are also used for non-gambling purposes, both where their use is mathematically important, such as sampling for opinion polls, and in situations where "fairness" is approximated by randomization, such as selecting jurors and military draft lotteries.
- Main article: Random number generation
The many applications of randomness have led to many different methods for generating random data. These methods may vary as to how unpredictable or statistically random they are, and how quickly they can generate random numbers.
Before the advent of computational random number generators, generating large amounts of sufficiently random numbers (important in statistics) required a lot of work. Results would sometimes be collected and distributed as random number tables.
- See also: Randomization
- "God doesn't play dice with the universe." —Albert Einstein
- "Random numbers should not be generated with a method chosen at random." —Donald E. Knuth
- "The generation of random numbers is too important to be left to chance." —Robert R. Coveyou, Oak Ridge National Laboratory, 1969
- "That which is static and repetitive is boring. That which is dynamic and random is confusing. In between lies art." —John A. Locke
- "Perhaps our thinking exemplifies a selective system. First lots of random scattered ideas compete for survival. Then comes the selection for what works best —one idea dominates, and this is followed by its amplification. Perhaps the moral [...] is that you never learn anything unless you are willing to take a risk and tolerate a little randomness in your life." —Heinz Pagels, The dreams of reason, 1988
- Randomness by Deborah J. Bennett. Harvard University Press, 1998. ISBN 0674107454
- The Art of Computer Programming. Vol. 2: Seminumerical Algorithms, 3rd ed. by Donald E. Knuth, Reading, MA: Addison-Wesley, 1997. ISBN 0-201-89684-2
- Fooled by Randomness, 2nd Ed. by Nassim Nicholas Taleb. Thomson Texere, 2004. ISBN 158799190X
- Can you behave randomly?
- Chaitin: Randomness and Mathematical Proof
- A Pseudorandom Number Sequence Test Program (Public Domain)
- Dictionary of the History of Ideas: Chance
- Philosophy: Free Will vs. Determinism
- History of randomness definitions, in Stephen Wolfram's A New Kind of Science.