This article defines some terms which characterize probability distributions of two or more variables.
Joint probability is the probability of two events in conjunction, that is, the probability that both events occur together. The joint probability of A and B is written P(A ∩ B) or P(A, B).
Marginal probability is the probability of one event, regardless of the other event. It is obtained by summing (or, more generally, integrating) the joint probability over the other event; this is called marginalization. For example, P(A) = P(A ∩ B) + P(A ∩ Bᶜ), where Bᶜ is the complement of B. The marginal probability of A is written P(A), and the marginal probability of B is written P(B).
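As a concrete illustration (the two-dice sample space and the events below are illustrative choices, not part of the article), a short Python sketch can compute joint and marginal probabilities by counting outcomes:

```python
from fractions import Fraction

# Illustrative sample space: ordered outcomes of two fair dice.
omega = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

def prob(event):
    """Probability of an event (a set of outcomes) under the uniform measure."""
    return Fraction(len(event), len(omega))

A = {o for o in omega if o[0] == 6}      # first die shows 6
B = {o for o in omega if sum(o) == 8}    # the dice sum to 8

print(prob(A & B))   # joint probability P(A ∩ B) = 1/36
print(prob(A))       # marginal probability P(A) = 1/6
print(prob(B))       # marginal probability P(B) = 5/36

# Marginalization: summing the joint probability over the possible values
# of the second die recovers the marginal P(A).
assert prob(A) == sum(prob(A & {o for o in omega if o[1] == k})
                      for k in range(1, 7))
```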
In these definitions, note that there need not be a causal or temporal relation between A and B. A may precede B, or vice versa, or they may happen at the same time. A may cause B, or vice versa, or they may have no causal relation at all.
Conditioning of probabilities, i.e. updating them to take account of (possibly new) information, may be achieved through Bayes' theorem.
If A and B are events, and P(B) > 0, then the conditional probability of A given B is P(A | B) = P(A ∩ B) / P(B).
Equivalently, we have P(A ∩ B) = P(A | B) · P(B).
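Continuing the two-dice illustration from above (again our own example rather than the article's), both the definition and the equivalent product form can be verified directly:

```python
from fractions import Fraction

omega = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
prob = lambda event: Fraction(len(event), len(omega))

A = {o for o in omega if o[0] == 6}      # first die shows 6
B = {o for o in omega if sum(o) == 8}    # the dice sum to 8

# Definition: P(A | B) = P(A ∩ B) / P(B), requiring P(B) > 0.
p_A_given_B = prob(A & B) / prob(B)
print(p_A_given_B)                       # 1/5

# Equivalent product form: P(A ∩ B) = P(A | B) · P(B).
assert prob(A & B) == p_A_given_B * prob(B)
```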
If P(A ∩ B) = P(A) · P(B), or equivalently, P(A | B) = P(A), then we say that A and B are independent.
If P(A ∩ B) = 0 and P(B) > 0, we say that A and B are mutually exclusive events. Then P(A | B) = 0 (i.e. the probability of A happening, given that B has happened, is nil, since A cannot happen if B happens).
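Both special cases can be spot-checked numerically; the dice events below are again illustrative choices rather than part of the article:

```python
from fractions import Fraction

omega = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
prob = lambda event: Fraction(len(event), len(omega))

# Independence: the two dice do not influence each other.
A = {o for o in omega if o[0] % 2 == 0}      # first die even
B = {o for o in omega if o[1] % 2 == 0}      # second die even
assert prob(A & B) == prob(A) * prob(B)      # P(A ∩ B) = P(A) · P(B)
assert prob(A & B) / prob(B) == prob(A)      # equivalently P(A | B) = P(A)

# Mutual exclusivity: the events cannot both happen.
C = {o for o in omega if sum(o) == 2}        # the dice sum to 2
D = {o for o in omega if sum(o) == 12}       # the dice sum to 12
assert prob(C & D) == 0                      # P(C ∩ D) = 0
assert prob(C & D) / prob(D) == 0            # hence P(C | D) = 0
```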
If B is an event and P(B) > 0, then the function Q defined by Q(A) = P(A | B) for all events A is a probability measure.
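A quick numerical spot-check (not a proof, and using the same illustrative dice setup as above) that conditioning on a fixed B with P(B) > 0 yields a normalized, additive set function Q:

```python
from fractions import Fraction

omega = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]
prob = lambda event: Fraction(len(event), len(omega))

B = {o for o in omega if sum(o) == 8}      # conditioning event, P(B) > 0
Q = lambda A: prob(set(A) & B) / prob(B)   # Q(A) = P(A | B)

assert Q(omega) == 1                       # normalization: Q of the whole space is 1
# Additivity over disjoint events: the singleton outcomes partition the space,
# so their Q-values must sum to 1.
assert sum(Q({o}) for o in omega) == 1
```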
If P(B) = 0, then P(A | B) is left undefined.
Conditional probability is often most easily calculated with a decision tree: each branch is labelled with the probability of that step conditional on the branches leading to it, so multiplying along a path gives a joint probability and summing over paths gives a marginal one.
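A minimal sketch of this tree bookkeeping (the urns and branch probabilities here are made up for illustration): multiply conditional probabilities along each path to get the joint probability of a leaf, sum leaves to marginalize, and divide to condition the other way round.

```python
from fractions import Fraction

# A two-stage decision tree with illustrative numbers: pick an urn,
# then draw a ball. Each branch carries a conditional probability.
tree = {
    "urn1": (Fraction(1, 2), {"red": Fraction(2, 3), "blue": Fraction(1, 3)}),
    "urn2": (Fraction(1, 2), {"red": Fraction(1, 4), "blue": Fraction(3, 4)}),
}

# Multiplying along a path gives the joint probability of its leaf:
# P(urn ∩ colour) = P(urn) · P(colour | urn).
joint = {(urn, colour): p_urn * p_colour
         for urn, (p_urn, branches) in tree.items()
         for colour, p_colour in branches.items()}

# Summing leaves marginalizes: P(red), over both urns.
p_red = sum(p for (urn, colour), p in joint.items() if colour == "red")

# Conditioning the other way round: P(urn1 | red) = P(urn1 ∩ red) / P(red).
print(joint[("urn1", "red")] / p_red)    # 8/11
```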
- Bayes' theorem
- Likelihood function
- Posterior probability
- Probability theory
- Monty Hall problem