Conditional distributions

If X and Y are two jointly distributed random variables, the probability distribution of Y when X is known to be a specific value is called the conditional distribution of Y given X. Conditional distributions are primarily divided into two classes, as discussed below:

Conditional discrete distributions

Assume X and Y are two discrete random variables with joint probability mass function:

p(x, y) = P(X = x, Y = y)

We can determine the conditional probability in terms of discrete random variables if we know the value of Y, say Y = y with p_Y(y) > 0:

p_{X|Y}(x | y) = P(X = x | Y = y) = p(x, y) / p_Y(y)

This provides us with the conditional probability mass function of X given Y = y. This probability can be calculated for any possible value x.
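As a minimal sketch of this computation, the Python snippet below builds p_{X|Y}(x | y) from a joint PMF stored as a dictionary. The table values are hypothetical and chosen only for illustration.

```python
# Conditional PMF p_{X|Y}(x | y) from a joint PMF.
# The joint table below is hypothetical, purely for illustration.
joint_pmf = {
    (1, 0): 0.10, (1, 1): 0.15,
    (2, 0): 0.20, (2, 1): 0.25,
    (3, 0): 0.05, (3, 1): 0.25,
}

def marginal_y(joint, y):
    """p_Y(y): sum of p(x, y) over all x."""
    return sum(p for (x, yy), p in joint.items() if yy == y)

def conditional_pmf_x_given_y(joint, y):
    """p_{X|Y}(x | y) = p(x, y) / p_Y(y) for every x with mass at y."""
    py = marginal_y(joint, y)
    return {x: p / py for (x, yy), p in joint.items() if yy == y}

print(conditional_pmf_x_given_y(joint_pmf, y=1))
# {1: 0.2307..., 2: 0.3846..., 3: 0.3846...}
```

Note that the resulting probabilities sum to 1, as any probability mass function must.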

Example

Suppose X and Y have a joint probability mass function p(x, y) given as:

Compute P(Y = 0 | X = 4).

Let's put the values into the formula mentioned above:

P(Y = 0 | X = 4) = P(X = 4, Y = 0) / P(X = 4) = p(4, 0) / p_X(4)
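The original joint table is not reproduced above, so the snippet below uses a hypothetical one just to show the arithmetic of this formula; the numbers are assumptions, not the example's data.

```python
# Hypothetical joint PMF with X in {2, 4} and Y in {0, 1}; these values
# are illustrative assumptions, not the table from the example.
joint_pmf = {
    (2, 0): 0.15, (2, 1): 0.25,
    (4, 0): 0.20, (4, 1): 0.40,
}

# Marginal: p_X(4) = p(4, 0) + p(4, 1)
p_x4 = joint_pmf[(4, 0)] + joint_pmf[(4, 1)]  # 0.60
# Conditional: P(Y = 0 | X = 4) = p(4, 0) / p_X(4)
print(joint_pmf[(4, 0)] / p_x4)               # 0.3333...
```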

Conditional continuous distributions

If Y is a continuous random variable, the probability associated with any particular value is zero, P(Y = y) = 0, so we cannot divide by it. Hence, conditional probabilities for continuous distributions are calculated by integration. Let's assume that h is extremely small; integrating the density of Y over [y, y + h], we get:

P(y ≤ Y ≤ y + h) = ∫_y^{y+h} f_Y(v) dv ≈ h f_Y(y)

Likewise, when X ∈ A, where A is any event, we can assert that:

P(X ∈ A, y ≤ Y ≤ y + h) = ∫_A ∫_y^{y+h} f(x, v) dv dx ≈ h ∫_A f(x, y) dx

Therefore, for a very small h we have:

P(X ∈ A | y ≤ Y ≤ y + h) ≈ (h ∫_A f(x, y) dx) / (h f_Y(y)) = ∫_A f(x, y) / f_Y(y) dx

Because h is so small, we take this to be the conditional probability for the continuous random variable, P(X ∈ A | Y = y).

Let's evaluate the steps above:

P(X ∈ A | Y = y) = ∫_A f(x, y) / f_Y(y) dx = ∫_A f_{X|Y}(x | y) dx

The integrand, f_{X|Y}(x | y) = f(x, y) / f_Y(y), is the probability density function of X conditioned on Y = y.
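As a sketch of how this plays out numerically, the snippet below evaluates P(a ≤ X ≤ b | Y = y) for an assumed joint density f(x, y) = x + y on the unit square (chosen for this sketch because it integrates to 1 there; it is not from the text).

```python
from scipy.integrate import quad

# Assumed joint density for this sketch: f(x, y) = x + y on [0, 1]^2.
def f_joint(x, y):
    return x + y

def f_Y(y):
    # Marginal of Y: integrate the joint density over x in [0, 1].
    val, _ = quad(lambda x: f_joint(x, y), 0.0, 1.0)
    return val  # analytically, y + 1/2

def prob_x_in_interval_given_y(a, b, y):
    # P(a <= X <= b | Y = y) = integral over [a, b] of f(x, y) / f_Y(y) dx
    val, _ = quad(lambda x: f_joint(x, y) / f_Y(y), a, b)
    return val

print(prob_x_in_interval_given_y(0.0, 0.5, 0.25))  # ~0.3333
```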

Example

The joint density function of X and Y is given as:

f(x, y) = 4x^2 y + 2y^5

where 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, and f(x, y) = 0 otherwise.

Compute P(0.3 ≤ Y ≤ 0.4 | X = 0.8).

Using the definition given:

P(0.3 ≤ Y ≤ 0.4 | X = 0.8) = ∫_{0.3}^{0.4} f_{Y|X}(y | 0.8) dy = ∫_{0.3}^{0.4} f(0.8, y) / f_X(0.8) dy

We substitute x = 0.8 into f(x, y) = 4x^2 y + 2y^5. For the conditioning on X we use the marginal f_X(x) = ∫_0^1 f(x, y) dy = 2x^2 + 1/3.

Now we'll calculate the values below:

f_X(0.8) = 2(0.8)^2 + 1/3 = 1.28 + 0.3333 ≈ 1.6133

P(0.3 ≤ Y ≤ 0.4 | X = 0.8) = ∫_{0.3}^{0.4} (2.56y + 2y^5) / 1.6133 dy = [1.28y^2 + y^6/3]_{0.3}^{0.4} / 1.6133 ≈ (0.2062 − 0.1154) / 1.6133 ≈ 0.0562
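As a quick numeric check of this result (a sketch, not part of the original example), the snippet below integrates the conditional density with SciPy:

```python
from scipy.integrate import quad

# Joint density from the example: f(x, y) = 4x^2*y + 2y^5 on [0, 1]^2.
def f_joint(x, y):
    return 4 * x**2 * y + 2 * y**5

def f_X(x):
    # Marginal of X: integrate the joint density over y in [0, 1].
    val, _ = quad(lambda y: f_joint(x, y), 0.0, 1.0)
    return val  # analytically, 2x^2 + 1/3

# P(0.3 <= Y <= 0.4 | X = 0.8) = integral of f(0.8, y) / f_X(0.8) over y
prob, _ = quad(lambda y: f_joint(0.8, y) / f_X(0.8), 0.3, 0.4)
print(round(prob, 4))  # 0.0562
```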

Relation to independence

Two events A and B are said to be independent if P(A ∩ B) = P(A) P(B). We have an equivalent definition of independence for the random variables X and Y, as stated below. Intuitively, independence of X and Y indicates that they do not influence each other: the values of X do not affect the probabilities for Y, and vice versa.

  • If X and Y are two discrete random variables, then X and Y are said to be independent if and only if p_{Y|X}(y | x) = p_Y(y), for every x, y ∈ R (a code check of this criterion follows the list).

  • If X and Y are jointly absolutely continuous random variables, then X and Y are independent if and only if f_{Y|X}(y | x) = f_Y(y), for every x, y ∈ R.
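The snippet below sketches the discrete criterion on a hypothetical joint table; it checks the equivalent factorization p(x, y) = p_X(x) p_Y(y), which holds exactly when p_{Y|X}(y | x) = p_Y(y).

```python
from itertools import product

# Hypothetical joint PMF; the values are assumptions for illustration.
joint_pmf = {
    (0, 0): 0.12, (0, 1): 0.28,
    (1, 0): 0.18, (1, 1): 0.42,
}

xs = sorted({x for x, _ in joint_pmf})
ys = sorted({y for _, y in joint_pmf})
# Marginals by summing rows/columns of the joint table.
p_x = {x: sum(joint_pmf[(x, y)] for y in ys) for x in xs}
p_y = {y: sum(joint_pmf[(x, y)] for x in xs) for y in ys}

# Independent iff p(x, y) = p_X(x) * p_Y(y) for every pair (x, y).
independent = all(
    abs(joint_pmf[(x, y)] - p_x[x] * p_y[y]) < 1e-12
    for x, y in product(xs, ys)
)
print(independent)  # True: this table factorizes as p_X(x) * p_Y(y)
```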

Applications

Conditional distributions are helpful when we collect data for two variables, such as gender and income, but want to answer probability questions once the value of one of the variables is known. In real life, there are numerous cases where we know the value of one variable and can use a conditional distribution to determine the likelihood of the other variable taking on a specific value.
