What is joint probability distribution?

A joint probability distribution is a probability distribution that describes the probability of two or more random variables occurring simultaneously. In other words, it provides the probability of events that involve multiple variables occurring together.

For example, let’s consider an experiment where we toss two fair coins, A and B, and want to find the probability of getting a tail on coin A and a head on coin B simultaneously. Here, the random variables are the outcomes of coin A and coin B, which we can represent as X and Y, respectively.

We can represent the joint probability distribution of two variables as P(X, Y), P(X and Y), or P(X ∩ Y), where ∩ represents intersection. We can also represent the joint probability distribution of two or more events using Venn diagrams. Considering the variables in the example above, we can represent the joint probability distribution of X and Y using a Venn diagram as follows:

Venn diagram representing the joint probability distribution of X and Y

Calculating joint probability

Rolling a fair die: Let’s take the example of rolling a fair die, where there are 6 possible outcomes. The outcome can be any number in {1, 2, 3, ..., 6}, each having a probability of 1/6. Now, we want to find the probability of rolling a 3 followed by a 6 when the die is rolled twice.

Let X and Y represent the occurrence of 3 and 6, respectively; we already know that P(X) = P(Y) = 1/6. Since the two rolls don’t affect each other, the multiplication rule applies: P(X, Y) = 1/6 × 1/6 = 1/36.
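The multiplication rule above can be checked by brute force: a short sketch that enumerates all 36 equally likely pairs of rolls and counts the favorable one (the variable names are my own, not from the text).

```python
from itertools import product

# Enumerate all 36 equally likely (first roll, second roll) pairs.
outcomes = range(1, 7)
pairs = list(product(outcomes, outcomes))

# Count pairs where the first roll is 3 and the second is 6.
favorable = sum(1 for first, second in pairs if first == 3 and second == 6)
joint = favorable / len(pairs)

print(joint)        # 1/36 ≈ 0.0278
print(1/6 * 1/6)    # the multiplication rule gives the same value
```

Only one of the 36 pairs is (3, 6), so the enumeration agrees with the product rule.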

Tossing a fair coin: Consider the example of flipping a fair coin, where the probability of landing on heads (H) or tails (T) is equal, i.e., P(H) = P(T) = 1/2. Let’s find the probability of landing on heads followed by tails when the coin is flipped twice.

P(H, T) = P(H) × P(T) = 1/2 × 1/2 = 1/4
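The same check works for coins: a minimal sketch that lists the four equally likely results of two flips and confirms the value above.

```python
from itertools import product

# All four equally likely results of two fair coin flips:
# ('H','H'), ('H','T'), ('T','H'), ('T','T')
flips = list(product("HT", repeat=2))

# Probability of heads on the first flip followed by tails on the second.
p_h_then_t = sum(1 for a, b in flips if (a, b) == ("H", "T")) / len(flips)
print(p_h_then_t)   # 0.25
```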

Exercise

Now, as a small exercise, find the probability of picking a card that is both black and a 6 from a standard deck of 52 cards.
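One way to check your answer is to enumerate a full deck in code; the suit and rank encoding below is just one illustrative choice, not part of the exercise itself.

```python
# Model a standard 52-card deck and count cards that are both black and a 6.
suits = ["spades", "clubs", "hearts", "diamonds"]  # spades and clubs are black
ranks = list(range(2, 11)) + ["J", "Q", "K", "A"]
deck = [(rank, suit) for suit in suits for rank in ranks]

black_sixes = [card for card in deck
               if card[0] == 6 and card[1] in ("spades", "clubs")]
probability = len(black_sixes) / len(deck)
print(probability)   # 2/52 = 1/26
```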

Types of joint probability distributions

There are three types of joint probability distributions, briefly described below:

  • When all the variables involved in the joint probability are discrete, meaning they can only take on finite or countable values, it’s called a discrete joint probability distribution. Examples include the outcome of rolling a die and the number of students attending school each day.

  • When all the variables involved in the joint probability are continuous, meaning they can take an infinite set of values falling into any range, it’s called the continuous joint probability distribution. Examples include calculating the height and weight of players in a team.

  • When some of the variables involved in the joint probability are discrete and some are continuous, it’s called a mixed joint probability distribution. An example is pairing the number of bugs recorded for a software system (discrete) with the length of time over which they were recorded (continuous).

Properties of the joint probability distribution

Some of the properties of the joint probability distribution are as follows:

Probability density function

A joint probability distribution assigns a probability to each possible combination of values of the random variables (for discrete variables, this is the joint probability mass function). Let’s consider the example of flipping a fair coin twice. The following table gives the probabilities for all possible combinations of coin flips, where X is the first flip and Y is the second.

          Y = H    Y = T
  X = H    1/4      1/4
  X = T    1/4      1/4
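The table above can be written down directly as a small data structure; a sketch using a dictionary keyed by (X, Y) outcome pairs (the names are my own):

```python
# Joint probability mass function for two fair coin flips,
# keyed by (first flip, second flip).
joint_pmf = {
    ("H", "H"): 1/4,
    ("H", "T"): 1/4,
    ("T", "H"): 1/4,
    ("T", "T"): 1/4,
}

# A valid pmf must sum to 1 over all outcome combinations.
total = sum(joint_pmf.values())
print(total)                  # 1.0
print(joint_pmf[("H", "T")])  # 0.25
```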

Marginal probability distribution

The marginal probability distribution of one variable involved in a joint probability distribution can be obtained by summing (for discrete variables) or integrating (for continuous variables) over all possible values of the other variables. Consider the coin-flipping table above. The marginal probability of X = H can be calculated as follows: P(X = H) = P(H, H) + P(H, T) = 1/4 + 1/4 = 1/2
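The summing step can be sketched in code: a small helper (my own naming) that recovers the marginal of X from the joint pmf by summing over every value of Y.

```python
# Joint pmf for two fair coin flips, keyed by (first flip, second flip).
joint_pmf = {
    ("H", "H"): 1/4, ("H", "T"): 1/4,
    ("T", "H"): 1/4, ("T", "T"): 1/4,
}

def marginal_x(pmf, x):
    """Sum P(x, y) over every y appearing in the joint pmf."""
    return sum(p for (xi, _), p in pmf.items() if xi == x)

print(marginal_x(joint_pmf, "H"))   # 0.5, matching P(X = H) = 1/2
```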

Independence

The independence of joint probability distributions refers to the scenario where the values of the variables don’t affect each other. If the variables are independent, the joint probability is the product of the individual probabilities: P(X, Y) = P(X) × P(Y).

In the coin-flipping example above, since the probability of landing on heads or tails on the second flip doesn’t depend on the outcome of the first flip, the two flips are independent.
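Independence can be tested mechanically: a sketch (helper name is my own) that compares P(X, Y) against P(X) × P(Y) for every outcome pair in the joint pmf.

```python
# Joint pmf for two fair coin flips.
joint_pmf = {
    ("H", "H"): 1/4, ("H", "T"): 1/4,
    ("T", "H"): 1/4, ("T", "T"): 1/4,
}

def is_independent(pmf, tol=1e-12):
    """Check P(x, y) == P(x) * P(y) for every outcome pair."""
    px, py = {}, {}
    for (x, y), p in pmf.items():
        px[x] = px.get(x, 0) + p   # marginal of X
        py[y] = py.get(y, 0) + p   # marginal of Y
    return all(abs(p - px[x] * py[y]) < tol for (x, y), p in pmf.items())

print(is_independent(joint_pmf))   # True: the flips don't influence each other
```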

Conditional probability

Conditional probability gives the likelihood of the occurrence of an event, given that another event has already occurred. Joint probability can also be used to calculate conditional probability.

Conditional probability is represented as follows:

P(A|B) = P(A, B) / P(B)

Here,

  • P(A|B) is the conditional probability of A, given B.
  • P(A, B) is the joint probability of A and B.
  • P(B) is the marginal probability of B.

Let’s calculate the conditional probability using the coin-flipping example, where we want to find the probability of getting heads on the second flip, given that we got tails on the first flip.

We already know that P(T, H) = 1/4 and P(T) = 1/2. Using the formula above, we can calculate P(H|T) as follows:

P(H|T) = P(T, H) / P(T) = (1/4) / (1/2) = 1/2
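The same division can be sketched from the joint pmf: the numerator is the joint entry for (T, H) and the denominator is the marginal probability that the first flip is tails (variable names are my own).

```python
# Joint pmf for two fair coin flips, keyed by (first flip, second flip).
joint_pmf = {
    ("H", "H"): 1/4, ("H", "T"): 1/4,
    ("T", "H"): 1/4, ("T", "T"): 1/4,
}

p_t_then_h = joint_pmf[("T", "H")]                        # P(T, H) = 1/4
p_t_first = sum(p for (first, _), p in joint_pmf.items()  # P(first = T) = 1/2
                if first == "T")

print(p_t_then_h / p_t_first)   # 0.5, matching P(H | T) = (1/4) / (1/2)
```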
