A conditional probability is the probability that one event occurs, given that another event already has. The concept of conditional probability allows us to revise our probability models given new information. We use the notation
$$P(A|B)$$
to mean "the probability that A occurs, given that B already has." Here are some Venn diagrams to illustrate the meaning of conditional probability.
First, imagine an experiment with events A and B. There are twelve possible outcomes, including six ways to obtain outcome A and seven ways to obtain B, with three of those shared. In this scenario, there are two outcomes in our sample space that aren't outcomes A or B.

In the sample space, the probabilities of A and B are
$$P(A) = \frac{6}{12} \phantom{00000} P(B) = \frac{7}{12}$$
That's just the number of dots in A divided by 12 dots total in Ω, and so on.
Now let's assume that event B has already occurred. If this is true, then three of the outcomes within A are no longer possible, and we have a new set of probabilities that we'll call conditional probabilities.

The first is
$$P(A|B) = \frac{3}{7}.$$
We read this as "the probability of obtaining outcome A after B has already been obtained." In order to calculate the probability of A occurring once B has already occurred, we note that three of the possible ways of obtaining A are ruled out (gray in the figure) because they're not in set B. The probability of obtaining A after B is the three remaining ways of obtaining A divided by the total number of B outcomes, 7. Once B happens, the probability of A generally changes — it stays the same only when A and B are independent, an idea we'll define below.
The second conditional probability we can write is trivial, but for completeness, notice that $P(B|B) = 1.$ That is, the probability of obtaining an outcome from set B is 1 because that's all we have left.
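This counting picture can be checked with a short Python sketch. The sets below use made-up labels for the twelve outcomes — they're hypothetical stand-ins chosen only to match the counts in the figure (6 in A, 7 in B, 3 shared):

```python
from fractions import Fraction

# Hypothetical labels for the 12 outcomes in the Venn diagram:
# 6 outcomes in A, 7 in B, 3 shared, 2 in neither.
A = {1, 2, 3, 4, 5, 6}
B = {4, 5, 6, 7, 8, 9, 10}
omega = set(range(1, 13))                     # 12 outcomes total

P_A = Fraction(len(A), len(omega))            # 6/12
P_B = Fraction(len(B), len(omega))            # 7/12

# Once B has occurred, the sample space shrinks to B:
P_A_given_B = Fraction(len(A & B), len(B))    # 3/7
P_B_given_B = Fraction(len(B & B), len(B))    # 1
```

Counting elements of the shrunken sample space is all that conditioning does in the finite, equally-likely case.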
Now let's do the same conditional probability problem, but this time we won't know the number of elements in each set, only the numerical probabilities of events $A$, $B$ and $A \cap B$. Here's the Venn diagram:

Now our conditional probabilities can be calculated like this. First,
$$P(B|B) = 1,$$
as we would expect. Now the conditional probability $P(A|B),$ "the probability that A will occur if B has already occured," is just the fraction of event B that also contains event A, or
$$ \begin{align} P(A|B) &= \frac{P(A \cap B)}{P(B)} \\[5pt] &= \frac{3/12}{7/12} = \frac{3}{7} \end{align}$$
The probability that event $A$ occurs after event $B$ is the probability that they both occur as a fraction of the total probability of the condition, $B$.
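In code, this definition is a one-line division; the fractions below are the example's own numbers:

```python
from fractions import Fraction

# Using only the event probabilities (no element counts), as in the
# figure: P(A ∩ B) = 3/12 and P(B) = 7/12.
P_A_and_B = Fraction(3, 12)
P_B = Fraction(7, 12)

# Definition of conditional probability: P(A|B) = P(A ∩ B) / P(B)
P_A_given_B = P_A_and_B / P_B   # (3/12) / (7/12) = 3/7
```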
Events may or may not be independent, and we often want to know whether the occurrence of one event affects the occurrence of another. Two events, A and C, are independent if the occurrence of one does not affect the probability of occurrence of the other. That means
$$P(A) = P(A|C) \: \: \color{#E90F89}{\text{or}} \: \: P(C) = P(C|A)$$
In the first equation, we see that the probability that event $A$ occurs is the same as the probability that $A$ occurs after $C$. In other words, it doesn't matter whether $C$ occurs first or not — they're uncoupled or independent. The second equation says the same thing. Here are two examples of a two-dice experiment to illustrate how we can check for independence:
Let A = {die-1 = 1} and B = {die-2 = 1}
$$P(A|B) = \frac{P(A \cap B)}{P(B)} = \frac{\frac{1}{36}}{\frac{1}{6}} = \frac{1}{6} = P(A)$$
Because $P(A|B) = P(A)$, the two events are independent.
Let C = {die-1 = 1} and A = {sum of dice = 3}
$$P(A|C) = \frac{P(A \cap C)}{P(C)} = \frac{\frac{1}{36}}{\frac{1}{6}} = \frac{1}{6} \ne P(A)$$
But the sum can be 3 in only two of the 36 outcomes, so $P(A) = \frac{2}{36} = \frac{1}{18} \ne \frac{1}{6}$, and these events are dependent.
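Both checks can be reproduced by enumerating all 36 equally likely outcomes of the two dice. This is a sketch — `prob` is a small helper defined here, not a library function:

```python
from fractions import Fraction
from itertools import product

dice = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes

def prob(event):
    """Probability of an event, given as a set of (die1, die2) pairs."""
    return Fraction(len(event), len(dice))

A  = {d for d in dice if d[0] == 1}           # die-1 shows 1
B  = {d for d in dice if d[1] == 1}           # die-2 shows 1
S3 = {d for d in dice if sum(d) == 3}         # dice sum to 3

# Independent: conditioning on B leaves P(A) unchanged.
assert prob(A & B) / prob(B) == prob(A)       # both are 1/6

# Dependent: conditioning on A changes the probability of the sum.
assert prob(S3 & A) / prob(A) != prob(S3)     # 1/6 vs. 1/18
```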
Let's say that a fair coin, with outcomes "heads" (H) and "tails" (T), is tossed three times.
(a) For the probability of tossing three heads, the tosses are independent, so
$$P(HHH) = P(H) \cdot P(H) \cdot P(H) = \left( \frac{1}{2} \right)^3 = \frac{1}{8}.$$
(b) Now for the probability of observing exactly one heads (H), we're looking for one of the outcomes HTT, THT or TTH. Just like in the previous calculation, the probabilities of each are $P(HTT) = P(THT) = P(TTH) = \frac{1}{8}.$
The probability of tossing one of these is the sum of three probabilities:
$$P(\text{one heads}) = \frac{1}{8} + \frac{1}{8} + \frac{1}{8} = \frac{3}{8}.$$
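Enumeration gives the same answer; here's a minimal Python check using nothing beyond the standard library:

```python
from fractions import Fraction
from itertools import product

tosses = list(product("HT", repeat=3))   # all 8 equally likely outcomes

# Outcomes with exactly one H: HTT, THT, TTH
one_head = [t for t in tosses if t.count("H") == 1]
P_one_head = Fraction(len(one_head), len(tosses))   # 3/8
```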
(c) The third probability is conditional. First, the probability of tossing at least one heads can be thought of in this way.
All possible outcomes of tossing three coins form the set {HHH, HHT, HTH, THH, THT, TTH, HTT, TTT}. All but the last have at least one heads, so the probability of tossing at least one heads is $P(1H) = \frac{7}{8}.$
Likewise, of the eight possible outcomes, those that contain at least two heads are HHT, HTH, THH and HHH, for a probability of $P(2H) = \frac{4}{8} = \frac{1}{2}.$
Now we have
$$P(2H|1H) = \frac{P(2H \cap 1H)}{P(1H)}$$
Now we plug in our probabilities from above. Note that $P(1H \cap 2H)$ is the intersection of the sets {HHH, HHT, HTH, THH, HTT, THT, TTH} and {HHH, HHT, HTH, THH}, which has four members and yields $P(1H \cap 2H) = \frac{4}{8},$ so our final probability is
$$P(2H|1H) = \frac{\frac{4}{8}}{\frac{7}{8}} = \frac{4}{8} \cdot \frac{8}{7} = \frac{4}{7}.$$
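This conditional probability can also be checked by enumerating the eight outcomes directly — a sketch using Python's `fractions` and `itertools`:

```python
from fractions import Fraction
from itertools import product

tosses = list(product("HT", repeat=3))                  # 8 outcomes
at_least_1H = {t for t in tosses if t.count("H") >= 1}  # 7 outcomes
at_least_2H = {t for t in tosses if t.count("H") >= 2}  # 4 outcomes

P_1H = Fraction(len(at_least_1H), len(tosses))                   # 7/8
P_both = Fraction(len(at_least_1H & at_least_2H), len(tosses))   # 4/8
P_2H_given_1H = P_both / P_1H                                    # 4/7
```

Notice that the intersection is just the at-least-two-heads set, since every outcome with two heads also has at least one.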
Let's say that in your city it is sunny $(S)$ one-third of the days. When it is sunny, there is a 50% chance that traffic will be heavy, but when it's rainy $(!S)$, that chance is reduced to 25%. If it's rainy and there is heavy traffic, you will arrive late to work 50% of the time. On sunny days with light traffic, you arrive to work late 1/8 of the time. When it's rainy with no traffic or sunny with traffic, the chance of your being late is 1/4. Now pick any random day and calculate:
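The specific questions are not listed here, but one natural quantity to compute from these numbers is the total probability of arriving late on a random day, found by summing over the four weather-and-traffic branches. A sketch (the branch probabilities are taken directly from the paragraph above):

```python
from fractions import Fraction

F = Fraction
P_S = F(1, 3)                        # P(sunny)
P_T_S, P_T_notS = F(1, 2), F(1, 4)   # P(heavy traffic | sun), P(heavy | rain)

# P(late | weather, traffic), as given in the problem statement:
P_L = {("S", "T"):  F(1, 4), ("S", "noT"):  F(1, 8),
       ("!S", "T"): F(1, 2), ("!S", "noT"): F(1, 4)}

# Sum of P(branch) * P(late | branch) over all four branches:
P_late = (P_S       * P_T_S           * P_L[("S", "T")]
        + P_S       * (1 - P_T_S)     * P_L[("S", "noT")]
        + (1 - P_S) * P_T_notS        * P_L[("!S", "T")]
        + (1 - P_S) * (1 - P_T_notS)  * P_L[("!S", "noT")])
```

Each term is a conditional probability multiplied by the probability of its condition — the same $P(A \cap B) = P(A|B)\,P(B)$ relationship from earlier, applied branch by branch.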
xaktly.com by Dr. Jeff Cruzan is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. © 2012-2025, Jeff Cruzan. All text and images on this website not specifically attributed to another source were created by me and I reserve all rights as to their use. Any opinions expressed on this website are entirely mine, and do not necessarily reflect the views of any of my employers. Please feel free to send any questions or comments to jeff.cruzan@verizon.net.