A) Let X_1, ..., X_n be i.i.d. normal random variables following the distribution N(μ, τ), where μ is the mean and τ is the variance.

Denote by μ̂ and τ̂ the maximum likelihood estimators of μ and τ, respectively, based on the i.i.d. observations X_1, ..., X_n.
Is the estimator 2μ̂² + τ̂ asymptotically normal?

1) Yes
2) No
3) Not enough information

(In our usual notation, τ = σ². We use τ in this problem to make clear that the parameter being estimated is not σ.)

B) Let g(μ, τ) = 2μ² + τ and let I(μ, τ) be the Fisher information matrix of X_i ~ N(μ, τ).

The asymptotic variance of 2μ̂² + τ̂ is...
1) \nabla g(\mu, \tau)^T \mathbf{I}(\mu, \tau) \nabla g(\mu, \tau)
2) \nabla g(\mu, \tau)^T \left(\mathbf{I}(\mu, \tau)\right)^{-1} \nabla g(\mu, \tau)
3) \nabla g(\mu, \tau)^T \mathbf{I}(\mu, \tau)
4) \nabla g(\mu, \tau)^T \left(\mathbf{I}(\mu, \tau)\right)^{-1}

C)
Using the results from above and referring back to homework solutions if necessary, compute the asymptotic variance V(2μ̂² + τ̂) of the estimator 2μ̂² + τ̂.

Hint: The inverse of the diagonal matrix

[a 0]
[0 b]

(with a, b ≠ 0) is the diagonal matrix

[1/a 0]
[0 1/b]
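
A quick numerical illustration of the hint (a sketch using NumPy; the values a = 2 and b = 5 are arbitrary):

```python
import numpy as np

a, b = 2.0, 5.0             # arbitrary nonzero diagonal entries
D = np.diag([a, b])

# The inverse of a diagonal matrix is the diagonal matrix of reciprocals.
print(np.linalg.inv(D))         # [[0.5 0. ] [0.  0.2]]
print(np.diag([1 / a, 1 / b]))  # the same matrix
```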

A) To determine whether the estimator 2μ̂² + τ̂ is asymptotically normal, we apply the multivariate delta method, which requires two conditions:

1. Asymptotic normality of the underlying estimator: under the usual regularity conditions (all satisfied by the normal model), the maximum likelihood estimator (μ̂, τ̂) is consistent and asymptotically normal, with asymptotic covariance matrix given by the inverse Fisher information.
2. Differentiability: the map g(μ, τ) = 2μ² + τ is continuously differentiable (it is a polynomial), and its gradient is never zero.

Both conditions hold, so 2μ̂² + τ̂ = g(μ̂, τ̂) is asymptotically normal; the answer is 1) Yes. To find its asymptotic variance, we compute the Fisher information matrix and apply the delta method.
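
As a sanity check, here is a minimal simulation sketch (assuming Python with NumPy and SciPy; the values μ = 1, τ = 2, n = 500, and the number of replications are arbitrary illustrative choices) showing that the centered and √n-scaled estimator looks approximately normal for large n:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu, tau, n, reps = 1.0, 2.0, 500, 5_000  # arbitrary illustrative values

estimates = np.empty(reps)
for r in range(reps):
    x = rng.normal(mu, np.sqrt(tau), size=n)
    mu_hat = x.mean()   # MLE of the mean
    tau_hat = x.var()   # MLE of the variance (divides by n, i.e. ddof=0)
    estimates[r] = 2 * mu_hat**2 + tau_hat

# Center at the true value g(mu, tau) and scale by sqrt(n).
z = np.sqrt(n) * (estimates - (2 * mu**2 + tau))
print("skewness:", stats.skew(z))             # close to 0 for large n
print("excess kurtosis:", stats.kurtosis(z))  # close to 0 for large n
```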

B) The Fisher information matrix I is defined as the negative expected value of the Hessian (the matrix of second derivatives) of the log-likelihood function with respect to the parameters. Since we have independent and identically distributed (i.i.d.) normal variables, the log-likelihood function of the sample can be expressed as:

ln L(μ, τ) = -(n/2) ln(2π) - (n/2) ln(τ) - (1/(2τ)) ∑(x_i - μ)²

Differentiating twice with respect to μ and τ and taking the negative expectation, the Fisher information matrix of the sample is:

I_n(μ, τ) = -E [∂²ln L/∂μ²   ∂²ln L/∂μ∂τ]
               [∂²ln L/∂τ∂μ  ∂²ln L/∂τ² ]

Computing the expectations (using E[x_i - μ] = 0 and E[(x_i - μ)²] = τ), we get:

I_n(μ, τ) = [n/τ 0      ]
            [0   n/(2τ²)]

Dividing by n gives the Fisher information of a single observation X_i ~ N(μ, τ), which is the matrix that appears in the delta-method formula:

I(μ, τ) = [1/τ 0      ]
          [0   1/(2τ²)]
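
This matrix can be checked symbolically; here is a sketch using SymPy (computing the per-observation information, i.e. n = 1):

```python
import sympy as sp
from sympy.stats import Normal, E

mu, tau = sp.symbols('mu tau', positive=True)
x = sp.Symbol('x', real=True)

# Log-density of a single observation X ~ N(mu, tau), tau = variance.
log_f = -sp.log(2 * sp.pi * tau) / 2 - (x - mu) ** 2 / (2 * tau)

X = Normal('X', mu, sp.sqrt(tau))   # SymPy parametrizes by the standard deviation
params = (mu, tau)

# Fisher information: negative expected Hessian of the log-density.
I = sp.Matrix(2, 2, lambda i, j: sp.simplify(
    -E(sp.diff(log_f, params[i], params[j]).subs(x, X))))

print(I)   # Matrix([[1/tau, 0], [0, 1/(2*tau**2)]])
```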

C) Now that we have the Fisher information matrix, we can compute the asymptotic variance of the estimator 2μ̂² + τ̂.

Using the multivariate delta method, the asymptotic variance of a smooth scalar function g of the MLE can be computed as follows:

V(g) = (∇g)^T I(μ, τ)^-1 (∇g)

where ∇g is the gradient of g(μ, τ) and I(μ, τ)^-1 is the inverse of the per-observation Fisher information matrix. (This is option 2 from part B.)

In this case, the function is g(μ, τ) = 2μ² + τ.

Taking the gradient of g(μ, τ), we get:

∇g = [4μ 1]^T

Plugging in the per-observation Fisher information, the asymptotic variance of the estimator 2μ̂² + τ̂ is:

V(2μ̂² + τ̂) = [4μ 1] [1/τ 0      ]^-1 [4μ 1]^T
                     [0   1/(2τ²)]

Simplifying the inverse (using the hint about diagonal matrices), we have:

V(2μ̂² + τ̂) = [4μ 1] [τ 0  ] [4μ 1]^T
                     [0 2τ²]

Expanding the matrix multiplication, we get:

V(2μ̂² + τ̂) = (4μ)²·τ + 1²·2τ² = 16μ²τ + 2τ²

So, the asymptotic variance of the estimator 2μ̂² + τ̂ is V(2μ̂² + τ̂) = 16μ²τ + 2τ².
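
As a final sanity check, a short Monte Carlo sketch (again with hypothetical values μ = 1, τ = 2, so that 16μ²τ + 2τ² = 40): the variance of 2μ̂² + τ̂ across repeated samples, scaled by n, should be close to this theoretical value.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, tau, n, reps = 1.0, 2.0, 1_000, 5_000  # arbitrary illustrative values

x = rng.normal(mu, np.sqrt(tau), size=(reps, n))
mu_hat = x.mean(axis=1)
tau_hat = x.var(axis=1)                    # MLE of the variance (ddof=0)
estimates = 2 * mu_hat**2 + tau_hat

print("empirical   n * Var:", n * estimates.var())
print("theoretical 16*mu**2*tau + 2*tau**2:", 16 * mu**2 * tau + 2 * tau**2)  # 40.0
```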