Let X, Y, and Z be independent discrete random variables with E[X] = 2, E[Y] = 0, E[Z] = 0, E[X^2] = 20, E[Y^2] = E[Z^2] = 16, and Var(X) = Var(Y) = Var(Z) = 16. Let A = X(Y+Z) and B = XY.

1. Find E[B].
2. Find Var(B).
3. Find E[AB].
4. Are A and B independent?
5. Are A and B conditionally independent, given X = 0?
6. Are A and B conditionally independent, given X = 1?

(Correction to the statement above: the variables are X, Y, and Z, i.e. "Let X, Y, Z be independent discrete random variables.")

1. E[B] = E[XY]

Since X and Y are independent, E[B] = E[X] * E[Y] = 2 * 0 = 0.

2. Var(B) = Var(XY)
We know that Var(XY) = E[(XY)^2] - (E[XY])^2

Since (XY)^2 = X^2 Y^2, and X^2 and Y^2 are independent (functions of independent random variables are themselves independent), we have:
E[(XY)^2] = E[X^2 Y^2] = E[X^2] E[Y^2] = 20 * 16 = 320

With E[XY] = E[X]E[Y] = 2 * 0 = 0 (since X and Y are independent),
Var(B) = E[(XY)^2] - (E[XY])^2
= 320 - 0^2
= 320
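As a sanity check, these moments can be verified by exact enumeration. The pmfs below are hypothetical choices (not given in the problem) that match the stated moments: X takes values -2 and 6 with probability 1/2 each (so E[X] = 2, E[X^2] = 20), and Y, Z take values -4 and 4 with probability 1/2 each (so E[Y] = 0, E[Y^2] = 16).

```python
from itertools import product

# Hypothetical pmfs chosen only to match the stated moments:
# X in {-2, 6} w.p. 1/2 each   -> E[X] = 2, E[X^2] = 20, Var(X) = 16
# Y, Z in {-4, 4} w.p. 1/2 each -> E = 0,   E[.^2] = 16,  Var = 16
X_pmf = [(-2, 0.5), (6, 0.5)]
YZ_pmf = [(-4, 0.5), (4, 0.5)]

def expect(f):
    """Exact E[f(X, Y, Z)] by enumerating the independent joint pmf."""
    return sum(px * py * pz * f(x, y, z)
               for (x, px), (y, py), (z, pz)
               in product(X_pmf, YZ_pmf, YZ_pmf))

E_B = expect(lambda x, y, z: x * y)          # E[XY]
E_B2 = expect(lambda x, y, z: (x * y) ** 2)  # E[(XY)^2]
print(E_B, E_B2 - E_B ** 2)                  # 0.0 320.0
```

Any other distributions with the same moments would give the same values, since E[B] and Var(B) depend only on the listed moments.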

3. E[AB] = E[X(Y+Z)XY]
Since X, Y, and Z are independent, we can expand and factor:
E[AB] = E[X^2 Y^2 + X^2 YZ]
= E[X^2]E[Y^2] + E[X^2]E[Y]E[Z]
= 20(16) + 20(0)(0)
= 320

4. No. From parts 1-3, E[A] = E[X]E[Y + Z] = 2 * 0 = 0 and E[B] = 0, while E[AB] = 320. Since Cov(A, B) = E[AB] - E[A]E[B] = 320 ≠ 0, A and B are not independent (independent random variables are uncorrelated).
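The nonzero covariance can be confirmed by enumeration under the same illustrative pmfs (X in {-2, 6}, Y and Z in {-4, 4}, all equally likely; these are hypothetical choices matching the stated moments):

```python
from itertools import product

# Hypothetical pmfs matching the stated moments (not given in the problem):
X_pmf = [(-2, 0.5), (6, 0.5)]
YZ_pmf = [(-4, 0.5), (4, 0.5)]

def expect(f):
    """Exact E[f(X, Y, Z)] over the independent joint pmf."""
    return sum(px * py * pz * f(x, y, z)
               for (x, px), (y, py), (z, pz)
               in product(X_pmf, YZ_pmf, YZ_pmf))

E_A = expect(lambda x, y, z: x * (y + z))          # E[A] = 0
E_B = expect(lambda x, y, z: x * y)                # E[B] = 0
E_AB = expect(lambda x, y, z: x * (y + z) * x * y)
print(E_AB - E_A * E_B)  # 320.0 -> Cov(A, B) != 0, so not independent
```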

5. Yes. Conditional on X = 0, A = 0(Y+Z) = 0 and B = 0Y = 0 with probability 1. Constant random variables are independent of everything, so A and B are conditionally independent given X = 0.

6. No. Conditional on X = 1, A = Y + Z and B = Y. Since Y and Z remain independent of each other when we condition on X, Cov(A, B | X = 1) = Cov(Y + Z, Y) = Var(Y) = 16 ≠ 0, so A and B are not conditionally independent given X = 1.
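To see the failure of conditional independence given X = 1 concretely, one can check whether the conditional joint pmf of (A, B) = (Y + Z, Y) factors into the product of its marginals. Taking Y and Z to be ±4 with probability 1/2 each is again just an illustrative choice matching E[Y] = 0 and E[Y^2] = 16:

```python
from itertools import product
from collections import defaultdict

YZ_pmf = [(-4, 0.5), (4, 0.5)]  # illustrative pmf with E = 0, E[.^2] = 16

# Build P(A = a, B = b | X = 1) with A = Y + Z, B = Y, plus the marginals.
joint, pA, pB = defaultdict(float), defaultdict(float), defaultdict(float)
for (y, py), (z, pz) in product(YZ_pmf, YZ_pmf):
    a, b = y + z, y
    joint[(a, b)] += py * pz
    pA[a] += py * pz
    pB[b] += py * pz

factors = all(abs(joint[(a, b)] - pA[a] * pB[b]) < 1e-12
              for a in list(pA) for b in list(pB))
print(factors)  # False: the joint pmf does not factor, so not independent
```

For example, P(A = -8, B = -4 | X = 1) = 1/4 here, while P(A = -8 | X = 1) P(B = -4 | X = 1) = 1/4 * 1/2 = 1/8.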

To solve this problem, we can use the properties of expected value and variance. Let's first calculate the required values step-by-step.

1. Find E[B]:
Since X and Y are independent, the expectation of their product factors into the product of expectations:
E[B] = E[XY] = E[X]E[Y]
E[B] = 2 * 0
E[B] = 0

2. Find Var(B):
Using the formula for variance, Var(B) = E[B^2] - (E[B])^2
We already calculated E[B] to be 0, so we only need to find E[B^2]:
E[B^2] = E[(XY)^2]
Since (XY)^2 = X^2 Y^2, and X^2 and Y^2 are independent:
E[B^2] = E[X^2 Y^2] = E[X^2]E[Y^2]
E[B^2] = 20 * 16
E[B^2] = 320

Now we can find Var(B):
Var(B) = E[B^2] - (E[B])^2
Var(B) = 320 - (0)^2
Var(B) = 320

3. Find E[AB]:
E[AB] = E[X(Y+Z) * XY] = E[X^2 Y^2 + X^2 YZ]
Since X, Y, and Z are independent, each term factors:
E[AB] = E[X^2]E[Y^2] + E[X^2]E[Y]E[Z]
E[AB] = 20 * 16 + 20 * 0 * 0
E[AB] = 320

4. Are A and B independent?
To check, compare E[AB] with E[A]E[B]. We have E[A] = E[X]E[Y + Z] = 2 * 0 = 0 and E[B] = 0, so E[A]E[B] = 0.

But E[AB] = 320 ≠ 0 = E[A]E[B], so Cov(A, B) = 320 ≠ 0 and A and B are not independent.

5. Are A and B conditionally independent given X = 0?
Conditional on X = 0, both A and B become deterministic:

A = 0(Y+Z) = 0
B = 0Y = 0

A constant random variable is independent of any random variable, so the conditional joint distribution of A and B trivially factors into the product of the conditional marginals. Therefore, A and B are conditionally independent given X = 0.

6. Are A and B conditionally independent given X = 1?
Conditional on X = 1, we have:

A = 1(Y+Z) = Y + Z
B = 1Y = Y

Both are functions of Y, so knowing B = Y changes the conditional distribution of A = Y + Z. Concretely, Cov(A, B | X = 1) = Cov(Y + Z, Y) = Var(Y) + Cov(Z, Y) = 16 + 0 = 16 ≠ 0. Therefore, A and B are not conditionally independent given X = 1.
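This conditional covariance is easy to verify numerically. The pmf below (Y, Z = ±4 with probability 1/2 each) is an illustrative assumption; any distributions with E[Y] = 0 and E[Y^2] = 16 give the same answer:

```python
from itertools import product

YZ_pmf = [(-4, 0.5), (4, 0.5)]  # illustrative: E[Y] = 0, E[Y^2] = 16

def expect(f):
    """Exact E[f(Y, Z)] over the independent joint pmf of Y and Z."""
    return sum(py * pz * f(y, z)
               for (y, py), (z, pz) in product(YZ_pmf, YZ_pmf))

# Conditional on X = 1: A = Y + Z and B = Y.
cov = (expect(lambda y, z: (y + z) * y)
       - expect(lambda y, z: y + z) * expect(lambda y, z: y))
print(cov)  # 16.0 = Var(Y): nonzero, so not conditionally independent
```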

Note: In general, to determine conditional independence, check whether the conditional joint distribution of A and B factors into the product of their individual conditional distributions.

To solve these problems, we will need to use the properties of expected value, variance, and independence. Let's break down each question and explain how to answer them step by step.

1. To find E[B], note that independence of X and Y lets the expectation of the product factor into the product of expectations:

E[B] = E[XY] = E[X] * E[Y].

Given that E[X] = 2 and E[Y] = 0, we can substitute these values:

E[B] = 2 * 0 = 0.

Therefore, E[B] = 0.

2. To find Var(B), we can use the definition of variance:

Var(B) = E[B^2] - (E[B])^2.

We know that E[B] = 0, so we just need to find E[B^2]:

E[B^2] = E[(XY)^2] = E[X^2 * Y^2] = E[X^2] * E[Y^2] (since X and Y are independent).

Given that E[X^2] = 20 and E[Y^2] = 16, we can substitute these values:

E[B^2] = 20 * 16 = 320.

Now we can substitute the values into the variance formula:

Var(B) = 320 - (0)^2 = 320.

Therefore, Var(B) = 320.

3. To find E[AB], expand the product and use linearity of expectation:

E[AB] = E[X(Y + Z) * XY] = E[X^2 * Y^2] + E[X^2 * Y * Z].

Since X, Y, and Z are independent, each term factors:

E[X^2 * Y^2] = E[X^2] * E[Y^2] = 20 * 16 = 320,

E[X^2 * Y * Z] = E[X^2] * E[Y] * E[Z] = 20 * 0 * 0 = 0.

Now we can substitute the values into the expectation formula:

E[AB] = 320 + 0 = 320.

Therefore, E[AB] = 320.

4. A and B are not independent. Independence would require E[AB] = E[A]E[B], but E[A] = E[X]E[Y + Z] = 2 * 0 = 0 and E[B] = 0 while E[AB] = 320, so Cov(A, B) = 320 ≠ 0.

5. Given X = 0, both A = 0(Y + Z) = 0 and B = 0Y = 0 are constants, and constants are independent of every random variable. Therefore, A and B are conditionally independent given X = 0; no additional distributional information is needed.

6. Given X = 1, A = Y + Z and B = Y, and Cov(A, B | X = 1) = Cov(Y + Z, Y) = Var(Y) = 16 ≠ 0 (Y and Z remain independent given X). Therefore, A and B are not conditionally independent given X = 1.