4. Let X1, ..., Xn be i.i.d. normal random variables following the distribution N(u, t), where u is the mean and t is the variance.

Denote by û and t̂ the maximum likelihood estimators of u and t, respectively, based on the i.i.d. observations X1, ..., Xn.
(In our usual notation, t = σ^2. We use t in this problem to make clear that the parameter being estimated is σ^2, not σ.)
a)

Is the estimator 2u^2 + t asymptotically normal?
Yes
No
b)

Let
g(u, t) = 2u^2 + t
and let I(u, t) be the Fisher information matrix of X1 ~ N(u, t).
The asymptotic variance of g(û, t̂) is...
• ∇g(u, t)^T I(u, t) ∇g(u, t)
• ∇g(u, t)^T (I(u, t))^-1 ∇g(u, t)
• ∇g(u, t)^T I(u, t)
• ∇g(u, t)^T (I(u, t))^-1



c) Using the results from above and referring back to the homework solutions if necessary, compute the asymptotic variance of the estimator 2u^2 + t.
Hint: The inverse of a diagonal matrix diag(a, b), where a and b are nonzero, is the diagonal matrix diag(1/a, 1/b).

Find V(2u^2 + t), the asymptotic variance of the estimator 2u^2 + t.

Use: the inverse of (a 0, 0 a) is (1/a 0, 0 1/a).
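The hint can be checked numerically; a minimal sketch (the diagonal entries a = 3.0, b = 5.0 are arbitrary nonzero choices):

```python
# Invert a 2x2 diagonal matrix diag(a, b) by inverting each diagonal
# entry, then verify that the product with the original is the identity.
a, b = 3.0, 5.0          # arbitrary nonzero diagonal entries
D = [[a, 0.0], [0.0, b]]
D_inv = [[1.0 / a, 0.0], [0.0, 1.0 / b]]

# 2x2 matrix product D @ D_inv, computed by hand
prod = [[sum(D[i][k] * D_inv[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)]
print(prod)  # the identity matrix, up to floating-point rounding
```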

To determine whether the estimator 2u^2 + t is asymptotically normal, we need to compute the asymptotic variance of the estimator.

a) To calculate the asymptotic variance, we use the Fisher information matrix. The Fisher information matrix, denoted as I(u, t), is defined as the expected value of the negative second derivative (Hessian) of the log-likelihood function of a single observation.

b) The log-likelihood function, denoted as L(u, t), can be computed as the sum of the log-likelihoods based on the i.i.d. observations X1, X2, ..., Xn. Since each Xi follows a normal distribution N(u, t), the log-likelihood function for a single observation Xi is given by:

log(L(u, t; Xi)) = -(1/2) * log(2π) - (1/2) * log(t) - (1/2) * (Xi - u)^2 / t
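As a sanity check, the three-term expression above can be compared against the log of the N(u, t) density computed directly; the test point x = 2.0, u = 1.0, t = 4.0 is an arbitrary choice:

```python
import math

def loglik(x, u, t):
    """Log-likelihood of one observation x under N(u, t), t = variance."""
    return (-0.5 * math.log(2 * math.pi)
            - 0.5 * math.log(t)
            - 0.5 * (x - u) ** 2 / t)

def log_density(x, u, t):
    """log of the N(u, t) density, computed from the pdf directly."""
    pdf = math.exp(-(x - u) ** 2 / (2 * t)) / math.sqrt(2 * math.pi * t)
    return math.log(pdf)

x, u, t = 2.0, 1.0, 4.0   # arbitrary test point
print(loglik(x, u, t), log_density(x, u, t))  # the two values agree
```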

c) Taking the derivatives of the log-likelihood function with respect to u and t, we get:

d log L/du = (Xi - u) / t
d log L/dt = -1/(2t) + (Xi - u)^2 / (2t^2)

d) The Fisher information matrix is obtained by taking the expected value of the negative Hessian matrix of the log-likelihood function. The Hessian matrix is a matrix of second-order partial derivatives with respect to u and t. For this problem, the Fisher information matrix is a 2x2 matrix with elements:

I(u, t) = [[-E(d^2 log L/du^2), -E(d^2 log L/du dt)],
[-E(d^2 log L/du dt), -E(d^2 log L/dt^2)]]

e) Since the observations X1, X2, ..., Xn are i.i.d. normal variables, we take the second derivatives of the log-likelihood and then compute the expectations using E(Xi) = u and E[(Xi - u)^2] = t. The results are:

-E(d^2 log L/du^2) = 1/t
-E(d^2 log L/du dt) = 0
-E(d^2 log L/dt^2) = 1/(2t^2)

f) Therefore, the Fisher information matrix is:

I(u, t) = [[1/t, 0],
[0, 1/(2t^2)]]
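These entries can be verified by simulation: the Fisher information equals the covariance matrix of the score, i.e. of the first derivatives of the log-likelihood at a single observation. A minimal Monte Carlo sketch (u = 1.0, t = 4.0 and the sample size are arbitrary choices):

```python
import random

random.seed(0)
u, t = 1.0, 4.0
n = 200_000

# Score components for a single observation x ~ N(u, t):
#   s_u = (x - u) / t
#   s_t = -1/(2t) + (x - u)^2 / (2 t^2)
s_u2 = s_t2 = s_ut = 0.0
for _ in range(n):
    x = random.gauss(u, t ** 0.5)
    su = (x - u) / t
    st = -1 / (2 * t) + (x - u) ** 2 / (2 * t ** 2)
    s_u2 += su * su
    s_t2 += st * st
    s_ut += su * st

# Empirical E[s_u^2], E[s_t^2], E[s_u s_t] should be close to
# 1/t = 0.25, 1/(2t^2) = 0.03125, and 0, respectively.
print(s_u2 / n, s_t2 / n, s_ut / n)
```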

g) To compute the asymptotic variance of the estimator 2u^2 + t, we use the delta method. Note that the formula involves the inverse of the Fisher information matrix:

Var(2u^2 + t) = nabla g(u, t)^T * (I(u, t))^-1 * nabla g(u, t)

where nabla g(u, t) is the gradient vector of g(u, t) = 2u^2 + t with respect to u and t. In this case, the gradient vector is:

nabla g(u, t) = [4u, 1]
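The gradient can be double-checked numerically with central finite differences at an arbitrary point, say (u, t) = (1.0, 4.0), where [4u, 1] = [4.0, 1.0]:

```python
def g(u, t):
    """The function whose asymptotic variance we want: g(u, t) = 2u^2 + t."""
    return 2 * u ** 2 + t

u, t, h = 1.0, 4.0, 1e-6
dg_du = (g(u + h, t) - g(u - h, t)) / (2 * h)  # should be close to 4u = 4.0
dg_dt = (g(u, t + h) - g(u, t - h)) / (2 * h)  # should be close to 1.0
print(dg_du, dg_dt)
```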

h) Since I(u, t) is diagonal, its inverse is obtained by inverting the diagonal entries (see the hint): (I(u, t))^-1 = [[t, 0], [0, 2t^2]]. Substituting the values into the formula, we get:

Var(2u^2 + t) = [4u, 1] * [[t, 0],
[0, 2t^2]] * [4u, 1]^T

= [4u*t, 2t^2] * [4u, 1]^T

= 16u^2 * t + 2t^2

Therefore, the asymptotic variance of the estimator 2u^2 + t is 16u^2*t + 2t^2. Since this variance is finite, the delta method confirms that the estimator is asymptotically normal, so the answer to part (a) is Yes.
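The result can be checked by simulation under assumed parameter values u = 1.0, t = 4.0, for which the predicted asymptotic variance is 16*1*4 + 2*16 = 96: simulate many samples of size n, form sqrt(n) * (2*xbar^2 + t_hat - (2u^2 + t)), and compare its empirical variance to 96. The sample size and number of repetitions below are arbitrary choices:

```python
import random
import statistics

random.seed(1)
u, t = 1.0, 4.0
n, reps = 200, 10_000
target = 16 * u ** 2 * t + 2 * t ** 2   # predicted asymptotic variance = 96

vals = []
for _ in range(reps):
    xs = [random.gauss(u, t ** 0.5) for _ in range(n)]
    xbar = sum(xs) / n
    t_hat = sum((x - xbar) ** 2 for x in xs) / n   # MLE of the variance
    vals.append(n ** 0.5 * (2 * xbar ** 2 + t_hat - (2 * u ** 2 + t)))

# Empirical variance of the scaled estimation error; should be near 96.
print(statistics.variance(vals), target)
```

The finite-sample variance differs from 96 only by O(1/n) terms, so with n = 200 the empirical value lands close to the asymptotic prediction.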