From Casella's Statistical Inference:

Definition 10.1.7 For an estimator $T_n$, if $\lim_{n\to\infty} k_n \operatorname{Var} T_n = \tau^2 < \infty$, where $\{k_n\}$ is a sequence of constants, then $\tau^2$ is called **the limiting variance** or **limit of the variances** of $T_n$.

Definition 10.1.9 For an estimator $T_n$, suppose that $k_n(T_n - \tau(\theta)) \to n(0, \sigma^2)$ in distribution. The parameter $\sigma^2$ is called **the asymptotic variance** or **variance of the limit distribution** of $T_n$.

I was wondering if both definitions depend on the choice of the

sequence $k_n$, because I suspect for some choice of the sequence

$k_n$, the convergence may fail, while for some other choice of the

sequence $k_n$, the convergence may succeed. Then are the two

definitions not well defined, since they are supposed not to depend on the choice of the sequence $k_n$?

For example, in the Lyapunov CLT, $\frac{1}{s_n} \sum_{i=1}^{n} (X_i - \mu_i) \xrightarrow{d} \mathcal{N}(0,\,1)$, where $s_n^2 = \sum_{i=1}^n \sigma_i^2$. According to the above definition of asymptotic variance, $T_n = \sum_{i=1}^n X_i$ and $\tau(\theta) = \sum_{i=1}^n \mu_i$ (should $\tau(\theta)$ be independent of the sample size $n$?), and the asymptotic variance of $\sum_{i=1}^n X_i$ is $1$ (this is hard to believe, since the variances $\sigma_i^2$ of the $X_i$ can be arbitrary as long as they are finite)?
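A quick sanity check on that reading of the definition (a minimal simulation sketch, not from the quoted text; the heterogeneous $\sigma_i$ sequence and normal $X_i$ below are hypothetical choices for illustration):

```python
import math
import random

random.seed(0)

n = 100      # number of independent, non-identically distributed terms
reps = 4000  # Monte Carlo replications

# Heterogeneous standard deviations sigma_i (arbitrary illustrative choice)
sigmas = [1.0 + 0.5 * math.sin(i) for i in range(1, n + 1)]
s_n = math.sqrt(sum(s * s for s in sigmas))  # s_n^2 = sum_i sigma_i^2

# Z = (1/s_n) * sum_i (X_i - mu_i), taking mu_i = 0 here
zs = []
for _ in range(reps):
    total = sum(random.gauss(0.0, s) for s in sigmas)
    zs.append(total / s_n)

mean = sum(zs) / reps
var = sum((z - mean) ** 2 for z in zs) / reps
print(var)  # close to 1: the variance of the limiting N(0, 1) distribution
```

The variance of the normalized statistic stays near $1$ no matter what the individual $\sigma_i^2$ are, because the factor $1/s_n$ absorbs them; the unnormalized sum $\sum_i X_i$ itself has variance $s_n^2$, which grows with $n$. So the "asymptotic variance is $1$" statement is about the normalized quantity, not about $\sum_i X_i$.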

Can the limiting distribution in the definition of the asymptotic variance be other than a normal distribution? When will the limiting variance and the asymptotic variance be the same?

Similarly but more generally, how can we define limiting moments and asymptotic moments? Is the limiting distribution in the definition of an asymptotic moment required to be a normal distribution? When will the limiting moment and the asymptotic moment coincide? For example, what do these two concepts look like for means, i.e., the limiting mean and the asymptotic mean?

Thanks and regards!


#### Best Answer

**Asymptotic Moments**

Let $\{X_n\}$ be a sequence of random variables with cumulative distribution functions $F_n(x)$. If this sequence converges in distribution to a random variable $X$, i.e., $\lim_{n\rightarrow \infty}F_n(x) = F(x)$ for every point of continuity of $F(x)$, then the asymptotic moments of $\{X_n\}$ are the moments of the limiting distribution $F(x)$.

**Limiting Moments**

Let $\{X_n\}$ be a sequence of random variables with cumulative distribution functions $F_n(x)$. For every moment $M_{n,r}$ of $F_n(x)$ that exists, the limiting moment is defined as $M_r \equiv \lim_{n\rightarrow \infty}M_{n,r}$.
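As a concrete instance (not from the quoted text; assuming i.i.d. $Y_i$ with mean $\mu$ and finite variance $\sigma^2$), take $X_n = \sqrt{n}(\bar{Y}_n - \mu)$ and $r = 2$:

$$M_{n,2} = \operatorname{Var}(X_n) = n \cdot \operatorname{Var}(\bar{Y}_n) = n \cdot \frac{\sigma^2}{n} = \sigma^2 \quad \text{for every } n, \qquad \text{so} \quad M_2 = \lim_{n\rightarrow\infty} M_{n,2} = \sigma^2.$$

Here the second limiting moment exists trivially because $M_{n,2}$ is constant in $n$.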

**When do the two coincide?**

If

1) $M_r \equiv \lim_{n\rightarrow \infty}M_{n,r}$ is finite (i.e., the limiting moment is a real number), and

2) there exists $\delta > 0$ such that $E\left(|X_n|^{r+\delta}\right) < M < \infty \;\; \forall n$,

then, if $X_n \rightarrow_d X$, the limiting moment $M_r$ will also be the asymptotic moment.
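A numerical illustration of this coincidence (a sketch under assumed i.i.d. sampling; the uniform distribution and the particular values of $\sigma^2$ and $n$ are arbitrary choices, picked because bounded support makes condition 2 hold automatically):

```python
import random

random.seed(1)

sigma2 = 4.0             # target Var(Y_i)
a = (3 * sigma2) ** 0.5  # Uniform(-a, a) has variance a^2 / 3 = sigma2
n = 200                  # sample size in each replication
reps = 5000              # Monte Carlo replications

# X_n = sqrt(n) * (Ybar_n - mu), with mu = 0 for Uniform(-a, a)
xs = []
for _ in range(reps):
    ybar = sum(random.uniform(-a, a) for _ in range(n)) / n
    xs.append(n ** 0.5 * ybar)

m = sum(xs) / reps
var = sum((x - m) ** 2 for x in xs) / reps
print(var)  # near sigma2 = 4
```

The second moment of $X_n$ is exactly $\sigma^2$ for every $n$ (the limiting moment), and by the CLT $X_n \rightarrow_d \mathcal{N}(0, \sigma^2)$, whose variance (the asymptotic moment) is also $\sigma^2$; the bounded $(2+\delta)$-th moments are what license passing the limit inside the expectation.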
