A joint probability distribution is a probability distribution that describes the probability of two or more random variables taking on particular values simultaneously. In other words, it gives the probability of events that involve multiple variables occurring together.
For example, let’s consider a fair coin toss experiment, where we toss two coins to find the probability of getting a tail on coin A and a head on coin B simultaneously. Here, the random variables are the outcomes of coin A and coin B, and we can represent them as X and Y, respectively.
We can represent the joint probability distribution of two variables as P(X, Y), P(X ∩ Y), or P(XY), where ∩ represents intersection. We can also represent the joint probability distribution of two or more events using Venn diagrams. Considering the variables in the example above, we can represent the joint probability of X and Y using a Venn diagram, where the overlapping region of the two circles represents P(X ∩ Y).
Rolling a fair die: Let’s take an example of rolling a fair die, where there are six possible outcomes when the die is rolled. The outcomes can be any number between 1 and 6, each having a probability of 1/6. Now, we want to find the probability of getting one particular number followed by another when the die is rolled twice.
Let A and B represent the occurrence of the first and second outcomes, respectively; we already know that P(A) = P(B) = 1/6. Thus, to calculate the probability of both A and B, the multiplication rule should be used, i.e., P(A ∩ B) = P(A) × P(B) = 1/6 × 1/6 = 1/36.
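The multiplication rule above can be double-checked by enumerating all ordered pairs of die faces; this is a minimal sketch, and the variable names are illustrative:

```python
from fractions import Fraction
from itertools import product

# Sample space: all ordered pairs from two rolls of a fair die.
outcomes = list(product(range(1, 7), repeat=2))

# Each of the 36 ordered pairs is equally likely.
p_pair = Fraction(1, len(outcomes))

# Multiplication rule for independent rolls: P(A and B) = P(A) * P(B).
p_a = Fraction(1, 6)  # probability of any fixed face on the first roll
p_b = Fraction(1, 6)  # probability of any fixed face on the second roll

print(p_pair)               # 1/36
print(p_a * p_b == p_pair)  # True
```

Using `Fraction` keeps the probabilities exact, so the enumeration result matches the multiplication rule with no floating-point error.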
Tossing a fair coin: Consider the example of flipping a fair coin, where the probability of landing on heads or tails is equal, i.e., P(H) = P(T) = 1/2. Let’s find the probability of landing on heads followed by tails when the coin is flipped twice: P(H ∩ T) = P(H) × P(T) = 1/2 × 1/2 = 1/4.
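The same answer falls out of listing the four equally likely two-flip sequences; a small sketch, with illustrative names:

```python
from fractions import Fraction
from itertools import product

# All sequences of two flips: HH, HT, TH, TT.
flips = list(product("HT", repeat=2))

# Favorable outcome: heads on the first flip, tails on the second.
favorable = [seq for seq in flips if seq == ("H", "T")]

p_heads_then_tails = Fraction(len(favorable), len(flips))
print(p_heads_then_tails)  # 1/4
```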
Now, as a small exercise, find the probability of drawing a card from a standard deck of cards that is both black and of a particular rank of your choice.
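One possible instantiation of this exercise can be checked by enumerating the deck; the choice of rank here (a king) is ours, not the original text’s:

```python
from fractions import Fraction
from itertools import product

# A standard 52-card deck: 13 ranks in each of 4 suits.
ranks = [str(n) for n in range(2, 11)] + ["J", "Q", "K", "A"]
suits = ["clubs", "diamonds", "hearts", "spades"]
deck = list(product(ranks, suits))

black = {"clubs", "spades"}

# Event: the card is black AND (illustratively) a king.
favorable = [(r, s) for r, s in deck if s in black and r == "K"]

p = Fraction(len(favorable), len(deck))  # 2 favorable cards out of 52
print(p)  # 1/26
```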
There are three types of joint probability distribution, which are briefly described below:
When all the variables involved in the joint probability are discrete, meaning they can only take on finite or countable values, it’s called a discrete joint probability distribution. Examples include the outcome of rolling a die and the number of students attending school each day.
When all the variables involved in the joint probability are continuous, meaning they can take any value within a range, it’s called a continuous joint probability distribution. Examples include the heights and weights of players on a team.
When some of the variables involved in the joint probability are discrete and some are continuous, it’s called a mixed joint probability distribution. Examples include the number of bugs recorded for a software system (discrete) paired with the time at which each bug is reported (continuous).
Some of the properties of the joint probability distribution are as follows:
A joint probability distribution assigns a probability to each possible combination of values of the random variables. Let’s consider the example of coin flipping, where the coin was flipped twice. The following table gives the probabilities for all possible combinations of coin flips.
| X \ Y | H   | T   |
|-------|-----|-----|
| H     | 1/4 | 1/4 |
| T     | 1/4 | 1/4 |
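In code, a joint distribution like the table above can be held as a mapping from outcome pairs to probabilities; a minimal sketch:

```python
from fractions import Fraction

# Joint distribution of two fair coin flips: P(X = x, Y = y).
joint = {
    ("H", "H"): Fraction(1, 4),
    ("H", "T"): Fraction(1, 4),
    ("T", "H"): Fraction(1, 4),
    ("T", "T"): Fraction(1, 4),
}

# A valid joint distribution covers every combination of values,
# and the probabilities sum to 1.
print(sum(joint.values()))  # 1
```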
The marginal probability distribution of a variable involved in the joint probability distribution function can be obtained by summing or integrating over all possible values of the other variables. Consider the coin-flipping table above. The marginal probability of X, where X is heads, can be calculated as follows: P(X = H) = P(H, H) + P(H, T) = 1/4 + 1/4 = 1/2.
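Summing the joint table over Y reproduces this marginal; a self-contained sketch using the coin-flip distribution from the table above:

```python
from fractions import Fraction

# Joint distribution of two fair coin flips, P(X, Y), from the table above.
joint = {("H", "H"): Fraction(1, 4), ("H", "T"): Fraction(1, 4),
         ("T", "H"): Fraction(1, 4), ("T", "T"): Fraction(1, 4)}

def marginal_x(x):
    # Marginal of X: sum the joint probabilities over all values of Y.
    return sum(p for (xi, _), p in joint.items() if xi == x)

print(marginal_x("H"))  # 1/2
```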
The independence of joint probability distributions refers to the scenario where the values of variables don’t affect each other. If the variables are independent, the joint probability is the product of individual probabilities.
In the coin-flipping example above, the probability of landing on heads or tails is determined by the coin itself and does not depend on the outcome of the first flip, so the two flips are independent and their joint probability factors into the product of the individual probabilities.
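This factoring can be verified programmatically by comparing each joint probability against the product of the corresponding marginals; a minimal sketch:

```python
from fractions import Fraction

# Joint distribution of two fair coin flips, P(X, Y).
joint = {("H", "H"): Fraction(1, 4), ("H", "T"): Fraction(1, 4),
         ("T", "H"): Fraction(1, 4), ("T", "T"): Fraction(1, 4)}

def marginal(axis, value):
    # axis 0 -> X (first flip), axis 1 -> Y (second flip)
    return sum(p for pair, p in joint.items() if pair[axis] == value)

# Independence: P(X = x, Y = y) == P(X = x) * P(Y = y) for every pair.
independent = all(
    p == marginal(0, x) * marginal(1, y) for (x, y), p in joint.items()
)
print(independent)  # True
```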
Conditional probability gives the likelihood of the occurrence of an event, given that another event has already occurred. Joint probability can also be used to calculate conditional probability.
Conditional probability is represented as follows: P(A | B) = P(A ∩ B) / P(B).
Here, P(A ∩ B) is the joint probability of A and B occurring together, and P(B) is the probability of B.
Let’s calculate the conditional probability using the coin flipping example, where we want to find the probability of getting heads on the second flip, when we got tails on the first flip.
We already know that P(T, H) = 1/4 and P(T) = 1/2. Using the formula above, we can calculate P(H on the second flip | T on the first flip) = P(T, H) / P(T) = (1/4) / (1/2) = 1/2.
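The same calculation can be carried out directly from the joint table; a self-contained sketch:

```python
from fractions import Fraction

# Joint distribution of two fair coin flips: (first flip, second flip).
joint = {("H", "H"): Fraction(1, 4), ("H", "T"): Fraction(1, 4),
         ("T", "H"): Fraction(1, 4), ("T", "T"): Fraction(1, 4)}

# P(first flip = T) = P(T, H) + P(T, T), the marginal of the first flip.
p_first_tails = joint[("T", "H")] + joint[("T", "T")]

# P(second = H | first = T) = P(T, H) / P(first = T)
p_heads_given_tails = joint[("T", "H")] / p_first_tails
print(p_heads_given_tails)  # 1/2
```

That the conditional probability equals the unconditional P(H) = 1/2 is exactly what independence of the two flips predicts.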