# Conditional probability

This article defines some terms that characterize probability distributions of two or more events.

Conditional probability is the probability of some event A, given that some other event B has occurred. It is written P(A|B) and read "the probability of A, given B".

Joint probability is the probability of two events occurring together. The joint probability of A and B is written $P(A \cap B)$ or $P(A, B)$.

Marginal probability is the probability of one event, irrespective of the other. It is obtained by summing (or, more generally, integrating) the joint probability over the other event; this operation is called marginalization. The marginal probability of A is written P(A), and the marginal probability of B is written P(B).
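Marginalization can be sketched concretely for a finite case. The joint distribution below is a hypothetical example (the numbers are assumptions for illustration, not from the article): summing over the outcomes of the other event yields each marginal.

```python
# Hypothetical joint distribution over two binary events A and B.
# Each entry is P(outcome of A, outcome of B); the four entries sum to 1.
joint = {
    ("A", "B"): 0.10,
    ("A", "not B"): 0.30,
    ("not A", "B"): 0.20,
    ("not A", "not B"): 0.40,
}

# Marginalize: P(A) sums the joint probability over both outcomes of B.
p_A = sum(p for (a, b), p in joint.items() if a == "A")

# Likewise, P(B) sums over both outcomes of A.
p_B = sum(p for (a, b), p in joint.items() if b == "B")
```

Here `p_A` is 0.10 + 0.30 = 0.40 and `p_B` is 0.10 + 0.20 = 0.30.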

In these definitions, note that there need not be a causal or temporal relation between A and B. A may precede B, or vice versa, or they may happen at the same time. A may cause B, or vice versa, or they may have no causal relation at all.

Conditioning of probabilities, i.e., updating them to take account of (possibly new) information, may be achieved through Bayes' theorem.
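For reference, Bayes' theorem relates the two conditional probabilities (assuming P(A) > 0 and P(B) > 0):

$P(B\mid A)=\frac{P(A\mid B)\cdot P(B)}{P(A)}.$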

## Relations

If A and B are events, and P(B) > 0, then

$P(A\mid B)=\frac{P(A \cap B)}{P(B)}.$

Equivalently, we have

$P(A \cap B)=P(A\mid B)\cdot P(B).$
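The two formulas above can be checked numerically on a small finite sample space. The two-dice events below are a hypothetical example, not from the article:

```python
from itertools import product

# Sample space: all ordered rolls of two fair six-sided dice (36 outcomes).
omega = list(product(range(1, 7), repeat=2))

# Event B: the first die shows an even number.
# Event A: the two dice sum to 8.
B = {w for w in omega if w[0] % 2 == 0}
A = {w for w in omega if sum(w) == 8}

# Probability of an event under the uniform distribution on omega.
p = lambda e: len(e) / len(omega)

# P(A | B) = P(A ∩ B) / P(B)
p_A_given_B = p(A & B) / p(B)

# Product rule: P(A ∩ B) = P(A | B) * P(B)
product_rule_holds = abs(p(A & B) - p_A_given_B * p(B)) < 1e-12
```

Here A ∩ B contains the rolls (2, 6), (4, 4), and (6, 2), so P(A ∩ B) = 3/36, P(B) = 1/2, and P(A|B) = 1/6.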

If $P(A \cap B) = P(A)P(B)$, or equivalently, P(A | B) = P(A), then we say that A and B are independent.
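Independence can be verified the same way. A minimal sketch, assuming two fair coin tosses (a hypothetical example):

```python
from itertools import product

# Sample space: two fair coin tosses, 4 equally likely outcomes.
omega = list(product("HT", repeat=2))

A = {w for w in omega if w[0] == "H"}  # first toss is heads
B = {w for w in omega if w[1] == "H"}  # second toss is heads

p = lambda e: len(e) / len(omega)

# Independence criterion: P(A ∩ B) == P(A) * P(B),
# equivalently P(A | B) == P(A).
independent = abs(p(A & B) - p(A) * p(B)) < 1e-12
```

Here P(A) = P(B) = 1/2 and P(A ∩ B) = 1/4, so the criterion holds and the tosses are independent.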

If $P(A \cap B) = 0$ and $P(B) \ne 0$, we say that A and B are mutually exclusive events. Then $P(A\mid B) = 0$ (i.e. the probability of A happening, given that B has happened, is nil since A cannot happen if B happens).

If B is an event and P(B) > 0, then the function Q defined by Q(A) = P(A | B) for all events A is a probability measure.
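That Q is a probability measure can be spot-checked on a finite sample space: Q assigns total mass 1, and it is additive on disjoint events. The dice events below are a hypothetical example:

```python
from itertools import product

# Finite sample space: two fair six-sided dice.
omega = frozenset(product(range(1, 7), repeat=2))
p = lambda e: len(e) / len(omega)

# Conditioning event B with P(B) > 0: first die is even.
B = frozenset(w for w in omega if w[0] % 2 == 0)

# Q(A) = P(A | B)
Q = lambda A: p(A & B) / p(B)

# Q gives the whole sample space probability 1 ...
total = Q(omega)

# ... and is additive on disjoint events (sum 7 and sum 8 cannot both occur).
A1 = frozenset(w for w in omega if sum(w) == 7)
A2 = frozenset(w for w in omega if sum(w) == 8)
additive = abs(Q(A1 | A2) - (Q(A1) + Q(A2))) < 1e-12
```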

If P(B) = 0, then P(A | B) is left undefined.

Conditional probabilities are often easiest to compute with a tree diagram: each branch is labelled with the conditional probability of that step, and multiplying along a path gives the joint probability of the events on it.
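A tree calculation can be sketched as follows. The scenario and all numbers are assumptions for illustration (a hypothetical diagnostic test): the first branching is on having the disease, the second on testing positive.

```python
# Assumed branch probabilities (hypothetical values, for illustration only).
p_D = 0.01               # P(disease)
p_pos_given_D = 0.95     # P(positive | disease)
p_pos_given_notD = 0.05  # P(positive | no disease)

# Multiplying along each path gives the joint probability at the leaf.
p_D_and_pos = p_D * p_pos_given_D
p_notD_and_pos = (1 - p_D) * p_pos_given_notD

# Summing the "positive" leaves marginalizes out the disease status ...
p_pos = p_D_and_pos + p_notD_and_pos

# ... and dividing recovers the conditional probability P(disease | positive).
p_D_given_pos = p_D_and_pos / p_pos
```

With these numbers, P(positive) = 0.0095 + 0.0495 = 0.059, so P(disease | positive) ≈ 0.161 despite the test's high sensitivity.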