Problem 1. Rice, Chapter 8, Exercise 7, parts (a) through (c) only, page 314.
Problem 2. Rice, Chapter 8, Problem 13.
Problem 3 (Is the MLE for i.i.d. exponential data asymptotically normal?). Let Xi, 1 ≤ i ≤ n, be i.i.d. exponential with parameter λ > 0.
(a) Does the support of this distribution depend on λ?
(b) Compute the maximum likelihood estimate for λ, λ̂.
(c) Consider the function g(x) = 1/x. Construct a second-order Taylor expansion of this function around the value 1/λ (why?), similar to the more general case you considered in the problem on Taylor expansions from the previous homework.
(d) Suppose the true value of λ is λ = λ0. Use this Taylor expansion to determine the asymptotic distribution of √n(λ̂ − λ0).
(e) Compute the Fisher information I(λ0) and determine whether your answers to the previous part agree with the asymptotic normality results we described in class.
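Not part of the assignment, but the conclusion of parts (b)–(e) can be sanity-checked numerically. The sketch below (all variable names are mine, and it assumes the rate-λ parametrization with density λe^{−λx}) simulates many data sets, computes the MLE λ̂ = 1/X̄ for each, and compares the empirical spread of √n(λ̂ − λ0) against the value predicted by the Fisher information:

```python
import numpy as np

# Sketch: simulate the sampling distribution of sqrt(n) * (lambda_hat - lambda_0)
# for i.i.d. Exponential(lambda_0) data (rate parametrization, mean 1/lambda_0).
# Asymptotic normality predicts variance 1/I(lambda_0); compare the empirical
# standard deviation against that prediction.
rng = np.random.default_rng(0)
lam0, n, trials = 2.0, 2000, 5000

samples = rng.exponential(scale=1.0 / lam0, size=(trials, n))
lam_hat = 1.0 / samples.mean(axis=1)      # MLE for each simulated data set
z = np.sqrt(n) * (lam_hat - lam0)         # rescaled estimation error

print(f"empirical sd of z: {z.std():.3f}")
print(f"empirical mean of z: {z.mean():.3f}")
```

If your answer to (e) is right, the printed standard deviation should match the square root of 1/I(λ0), and the mean should be near 0.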
Problem 4 (MLE of uniform distribution). Rice, Exercise 53, page 324: we worked through the essential ideas in lecture!
Problem 5 (More on MLE of a uniform distribution). Let Xi be i.i.d. uniform on [0, θ]. Let θ̂n be the MLE for θ that you obtained from the previous exercise.
(a) Show that P[θ̂n − θ > ε] = 0 for any ε > 0.
(b) For any ε > 0, determine an explicit expression for the probability P[|θ̂n − θ| > ε].
(c) Compute, for any ε > 0, the limit of P[|√n(θ̂n − θ)| > ε] as n → ∞.
(d) What do your previous answers suggest about the asymptotic distribution of √n(θ̂n − θ)? In particular, does this still look approximately normal?
(e) What is the Method-of-Moments estimate for θ, θ̂MOM? What is the limiting distribution of √n(θ̂MOM − θ0), where θ0 is the true value of the parameter?
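Not part of the assignment, but the non-standard behavior in parts (a)–(d) can be seen in a quick simulation (all names are mine). For Uniform[0, θ] data the MLE is the sample maximum, and its error shrinks at rate 1/n rather than 1/√n: the rescaled error n(θ − θ̂n) has an exponential-looking limit, so √n(θ̂n − θ) collapses to 0 instead of looking normal.

```python
import numpy as np

# Sketch: theta_hat = max(X_i) for Uniform[0, theta] data. Check that the
# tail probability P[n(theta - theta_hat) > t] approaches exp(-t/theta),
# i.e. an Exponential limit at the faster rate n (not sqrt(n)).
rng = np.random.default_rng(0)
theta, n, trials = 3.0, 1000, 20000

samples = rng.uniform(0.0, theta, size=(trials, n))
theta_hat = samples.max(axis=1)       # MLE never exceeds theta
err = n * (theta - theta_hat)         # error rescaled at rate n

t = 2.0
print(f"empirical tail: {(err > t).mean():.3f}")
print(f"exp(-t/theta):  {np.exp(-t / theta):.3f}")
```

The two printed numbers should agree closely, which is exactly the limit computed in part (c) after replacing √n with n.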
Problem 6. Choose the correct answer and justify your choice completely.
Suppose Xn, n ≥ 1, is a sequence of random variables on a probability space Ω. Suppose all the random variables Xn have common mean E[Xn] = µ. Suppose X is also a random variable defined on Ω. Suppose that for all n, the variance V(Xn) ≤ M, where M is a fixed constant. Which of the following is true?
(a) Because the variances of the Xn random variables are all bounded by a constant M, this guarantees, by Chebyshev’s inequality, that Xn converges in probability to the common mean µ.
(b) Because the variances of the Xn random variables are all bounded by a constant M, this guarantees, by Chebyshev’s inequality, that Xn converges to µ in L².
(c) The only way to guarantee either of the options in (a) and (b) is to require that the random variables Xn be independent.
(d) If there exists a single sample point ω ∈ Ω such that the sequence Xn(ω) converges to µ, then we know that Xn converges to µ with probability one.
(e) Both (a) and (b) are true.
(f) All of (a), (b), and (d) are true.
(g) None of the above.
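Not part of the assignment, but when weighing the options it can help to test them against a concrete sequence (this particular example is mine, not from the problem): i.i.d. random signs have a common mean, variance bounded by M = 1, and are even independent, yet they never settle down.

```python
import numpy as np

# Sketch: X_n i.i.d. with P[X_n = 1] = P[X_n = -1] = 1/2, so E[X_n] = 0 = mu
# and V(X_n) = 1 <= M for every n. Check whether |X_n - mu| ever gets small.
rng = np.random.default_rng(0)
x = rng.choice([-1.0, 1.0], size=10_000)

# |X_n - 0| = 1 for every n, so this proportion stays at 1 and never -> 0.
print(f"P[|X_n - mu| > 0.5] ~ {(np.abs(x) > 0.5).mean()}")
```

Whether this sequence converges to µ in probability or in L², and what that says about each option, is for you to argue.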
Problem 7. Choose the correct answer, and justify your response completely. Suppose Ω = [0, 1], and let ω ∈ Ω be drawn uniformly at random from the unit interval. Define the sequence of random variables Xn, n ≥ 1, on Ω by

Xn(ω) = n^α I_[0, 1/n²](ω) + n^α I_[1 − 1/n, 1](ω)
Here α ∈ R is a fixed, finite real number, and the notation I_A above denotes the indicator function of the set A. Let X be the random variable defined by X(ω) = 0 for all ω ∈ [0, 1]. Which of the following is true?
(a) For α = 1, the expected value of Xn is E[Xn] = 1 + 1/n.
(b) Regardless of the value of α, Xn converges to X in probability.
(c) Regardless of the value of α, Xn cannot converge to X with probability one because for any δ > 0, the set {ω : |Xn(ω) − X(ω)| > δ} has positive probability.
(d) If α = 1, then for any n > 51, we are guaranteed that |Xn(ω) − X(ω)| < 1/50 no matter what the value of ω ∈ [0, 1] happens to be.
(e) Both (a) and (b) are true.
(f) Both (a) and (c) are true.
(g) Both (b) and (c) are true.
(h) All three of (a), (b), and (c) are true.
(i) All three of (a), (b), and (d) are true.
(j) All four of (a), (b), (c), and (d) are true.
(k) None of the above.
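Not part of the assignment, but two of the quantities appearing in the options can be checked numerically (the function name and variables below are mine): the mean of Xn for α = 1, and the probability of the event where Xn differs from X.

```python
import numpy as np

# Sketch: X_n(w) = n^a on [0, 1/n^2] and on [1 - 1/n, 1], zero elsewhere.
def X(n, a, w):
    return n**a * ((w <= 1.0 / n**2) | (w >= 1.0 - 1.0 / n))

rng = np.random.default_rng(0)
w = rng.uniform(0.0, 1.0, size=2_000_000)   # omega drawn uniformly from [0, 1]

n, a = 100, 1.0
# Exact mean for a = 1: n * (1/n^2 + 1/n) = 1/n + 1.
print(f"E[X_n] ~ {X(n, a, w).mean():.4f}  vs  1 + 1/n = {1 + 1/n:.4f}")
# The event {X_n != 0} has probability 1/n^2 + 1/n, whatever a is.
print(f"P[X_n != 0] ~ {(X(n, a, w) != 0).mean():.5f}  vs  {1/n**2 + 1/n:.5f}")
```

How these two facts bear on convergence in probability versus convergence with probability one is the heart of the problem and is left to your justification.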