1. Suppose three random variables X, Y, Z have a joint distribution

P_{X,Y,Z}(x,y,z) = P_X(x) P_{Z|X}(z|x) P_{Y|Z}(y|z).

Then, are X and Y independent given Z?

2. Suppose random variables X and Y are independent given Z. Then the joint distribution must be of the form

P_{X,Y,Z}(x,y,z) = h(x,z) g(y,z),

where h, g are some functions.
(True or False)

1. To determine whether X and Y are independent given Z, we need to check whether the conditional joint P_{X,Y|Z}(x,y|z) factorizes into the product of the conditional marginals P_{X|Z}(x|z) and P_{Y|Z}(y|z). In this case, the joint distribution is given as:

P_{X,Y,Z}(x,y,z) = P_X(x) P_{Z|X}(z|x) P_{Y|Z}(y|z)

An equivalent test is the factorization criterion: X and Y are independent given Z if and only if the joint can be written as the product of two functions, one depending only on x and z (h(x,z)) and the other only on y and z (g(y,z)).

To express the joint distribution in that form, group the factors as follows:

P_{X,Y,Z}(x,y,z) = [P_X(x) P_{Z|X}(z|x)] * [P_{Y|Z}(y|z)]

The first two factors depend only on x and z, so we can take h(x,z) = P_X(x) P_{Z|X}(z|x); the remaining factor depends only on y and z, so we can take g(y,z) = P_{Y|Z}(y|z). Therefore, we can express the joint distribution as:

P_{X,Y,Z}(x,y,z) = h(x,z) * g(y,z)

Since we can factorize the joint distribution in this way, X and Y are independent given Z.
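As a sanity check, here is a minimal numerical sketch of this argument in Python (the alphabet sizes and the Dirichlet-sampled probability tables are arbitrary values chosen only for illustration): it builds a joint of the assumed form P_X P_{Z|X} P_{Y|Z} over small finite alphabets and verifies that P_{X,Y|Z}(x,y|z) = P_{X|Z}(x|z) P_{Y|Z}(y|z) for every z.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up alphabet sizes and random probability tables (illustrative only).
nx, nz, ny = 2, 3, 2
P_X = rng.dirichlet(np.ones(nx))                   # P_X(x)
P_Z_given_X = rng.dirichlet(np.ones(nz), size=nx)  # P_{Z|X}(z|x), rows sum to 1
P_Y_given_Z = rng.dirichlet(np.ones(ny), size=nz)  # P_{Y|Z}(y|z), rows sum to 1

# Joint of the assumed form: P(x,y,z) = P_X(x) P_{Z|X}(z|x) P_{Y|Z}(y|z)
P = np.einsum('x,xz,zy->xyz', P_X, P_Z_given_X, P_Y_given_Z)

P_Z = P.sum(axis=(0, 1))   # P_Z(z)
P_XZ = P.sum(axis=1)       # P_{X,Z}(x,z)
P_YZ = P.sum(axis=0)       # P_{Y,Z}(y,z)

for z in range(nz):
    cond_joint = P[:, :, z] / P_Z[z]           # P_{X,Y|Z}(x,y|z)
    cond_prod = np.outer(P_XZ[:, z] / P_Z[z],  # P_{X|Z}(x|z)
                         P_YZ[:, z] / P_Z[z])  # P_{Y|Z}(y|z)
    assert np.allclose(cond_joint, cond_prod)

print("P_{X,Y|Z} = P_{X|Z} P_{Y|Z} for every z: X and Y are independent given Z")
```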

2. True. If X and Y are independent given Z, then P_{X,Y|Z}(x,y|z) = P_{X|Z}(x|z) P_{Y|Z}(y|z) whenever P_Z(z) > 0, so by the chain rule

P_{X,Y,Z}(x,y,z) = P_Z(z) P_{X|Z}(x|z) P_{Y|Z}(y|z).

Taking h(x,z) = P_Z(z) P_{X|Z}(x|z) = P_{X,Z}(x,z) and g(y,z) = P_{Y|Z}(y|z) writes the joint in the required form, so a factorization of this kind always exists.
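A minimal sketch of this construction, again in Python with made-up finite distributions: it builds an arbitrary conditionally independent joint, forms the explicit witnesses h(x,z) = P_{X,Z}(x,z) and g(y,z) = P_{Y|Z}(y|z), and checks that their product reproduces the joint.

```python
import numpy as np

rng = np.random.default_rng(1)

# Arbitrary conditionally independent joint (values made up for illustration):
# P(x,y,z) = P_Z(z) P_{X|Z}(x|z) P_{Y|Z}(y|z)
nx, nz, ny = 2, 3, 2
P_Z = rng.dirichlet(np.ones(nz))
P_X_given_Z = rng.dirichlet(np.ones(nx), size=nz)  # shape (nz, nx)
P_Y_given_Z = rng.dirichlet(np.ones(ny), size=nz)  # shape (nz, ny)
P = np.einsum('z,zx,zy->xyz', P_Z, P_X_given_Z, P_Y_given_Z)

# Explicit witnesses: h(x,z) = P_{X,Z}(x,z), g(y,z) = P_{Y|Z}(y|z)
h = P.sum(axis=1)   # marginalize out y -> P_{X,Z}, shape (nx, nz)
g = P_Y_given_Z.T   # P_{Y|Z} arranged as (ny, nz)

# h(x,z) g(y,z) reproduces the joint exactly
assert np.allclose(P, np.einsum('xz,yz->xyz', h, g))
print("P(x,y,z) = h(x,z) g(y,z) with h = P_{X,Z} and g = P_{Y|Z}")
```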

1. If the joint distribution of X, Y, and Z can be factorized as P_{X,Y,Z}(x,y,z) = P_X(x) P_{Z|X}(z|x) P_{Y|Z}(y|z), then X and Y are independent given Z.

To show this, we need to verify that the conditional joint P_{X,Y|Z}(x,y|z) splits into P_{X|Z}(x|z) P_{Y|Z}(y|z) for every z with P_Z(z) > 0.

Using Bayes' theorem, we have:
P_{X|Z}(x|z) = P_{X,Z}(x,z) / P_Z(z)
= P_X(x) P_{Z|X}(z|x) / P_Z(z) (by the assumed factorization, after summing out y)
with P_Z(z) = ∑_x' P_X(x') P_{Z|X}(z|x') (by the law of total probability)

Dividing the joint by P_Z(z) therefore gives
P_{X,Y|Z}(x,y|z) = P_X(x) P_{Z|X}(z|x) P_{Y|Z}(y|z) / P_Z(z) = P_{X|Z}(x|z) P_{Y|Z}(y|z),
which is exactly the definition of conditional independence. Therefore, X and Y are independent given Z.
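The Bayes/total-probability step can also be checked numerically; the tables below are illustrative values only. Note that each column of the result is a valid conditional distribution, and that it genuinely varies with z, so the z-dependence must be kept throughout the derivation.

```python
import numpy as np

# Tiny made-up tables (illustrative values only).
P_X = np.array([0.4, 0.6])                 # P_X(x), x in {0, 1}
P_Z_given_X = np.array([[0.2, 0.5, 0.3],
                        [0.7, 0.1, 0.2]])  # P_{Z|X}(z|x), rows sum to 1

# Law of total probability: P_Z(z) = sum_x P_X(x) P_{Z|X}(z|x)
P_Z = P_X @ P_Z_given_X

# Bayes' theorem: P_{X|Z}(x|z) = P_X(x) P_{Z|X}(z|x) / P_Z(z)
P_X_given_Z = P_X[:, None] * P_Z_given_X / P_Z[None, :]

# Each column sums to 1 over x...
assert np.allclose(P_X_given_Z.sum(axis=0), 1.0)
# ...and the columns differ, i.e. P_{X|Z}(x|z) != P_X(x) in general.
print(P_X_given_Z)
```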

2. True.
If X and Y are independent given Z, the joint distribution can always be factorized as P_{X,Y,Z}(x,y,z) = h(x,z) g(y,z): by the chain rule and conditional independence, P_{X,Y,Z}(x,y,z) = P_{X,Z}(x,z) P_{Y|Z}(y|z), so h(x,z) = P_{X,Z}(x,z) and g(y,z) = P_{Y|Z}(y|z) work. The factorization is not unique, but at least one such pair of functions always exists.

1. Yes

2. True