I found in my intro to stats textbook that the $t$-distribution approaches the standard normal as $n$ goes to infinity. The textbook gives the density for the $t$-distribution as follows: $$f(t)=\frac{\Gamma\left(\frac{n+1}{2}\right)}{\sqrt{n\pi}\,\Gamma\left(\frac{n}{2}\right)}\left(1+\frac{t^2}{n}\right)^{-\frac{n+1}{2}}$$

I think it might be possible to show that this density converges (uniformly) to the density of the normal as $n$ goes to infinity. Given $$\lim_{n\to \infty}\left(1+\frac{t^2}{n}\right)^{-\frac{n+1}{2}}=e^{-\frac{t^2}{2}},$$ it would be great if we could show

$$\frac{\Gamma\left(\frac{n+1}{2}\right)}{\Gamma\left(\frac{n}{2}\right)}\to \sqrt{\frac{n}{2}}$$ as $n\to \infty$, yet I am stuck here. Can someone point out how to proceed, or suggest an alternative way to show that the $t$-distribution converges to the normal as $n\to \infty$?
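As a quick sanity check on the claimed convergence (my addition, not part of the original question), the $t$ density can be evaluated directly from the formula above and compared with the standard normal density $\varphi(t)=e^{-t^2/2}/\sqrt{2\pi}$; the maximum gap on a grid shrinks as $n$ grows:

```python
import math

# Numerical sketch: evaluate the t density from the textbook formula
# (via log-gamma, to stay numerically stable) and compare it with the
# standard normal density as the degrees of freedom n grow.
def t_pdf(t, n):
    log_c = math.lgamma((n + 1) / 2) - math.lgamma(n / 2) - 0.5 * math.log(n * math.pi)
    return math.exp(log_c - (n + 1) / 2 * math.log(1 + t * t / n))

def norm_pdf(t):
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

grid = [i / 2 for i in range(-8, 9)]  # t in [-4, 4]
for n in (1, 10, 100, 1000):
    gap = max(abs(t_pdf(t, n) - norm_pdf(t)) for t in grid)
    print(f"n={n:5d}  max gap = {gap:.2e}")
```

The gap decreases roughly like $O(1/n)$, consistent with uniform convergence of the densities.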


#### Best Answer

Stirling's approximation gives $$\Gamma(z) = \sqrt{\frac{2\pi}{z}}\,{\left(\frac{z}{e}\right)}^z \left(1 + O\left(\tfrac{1}{z}\right)\right),$$ so

$$\frac{\Gamma\left(\frac{n+1}{2}\right)}{\Gamma\left(\frac{n}{2}\right)} = \frac{\sqrt{\frac{2\pi}{(n+1)/2}}\,\left(\frac{(n+1)/2}{e}\right)^{\frac{n+1}{2}}}{\sqrt{\frac{2\pi}{n/2}}\,\left(\frac{n/2}{e}\right)^{\frac{n}{2}}}\left(1 + O\left(\tfrac{1}{n}\right)\right) \\ = \sqrt{\frac{n+1}{2e}}\left(1+\frac{1}{n}\right)^{\frac{n}{2}}\left(1 + O\left(\tfrac{1}{n}\right)\right) \\ = \sqrt{\frac{n}{2}}\left(1 + O\left(\tfrac{1}{n}\right)\right),$$ so the ratio behaves like $\sqrt{\frac{n}{2}}$ for large $n$ (and you may have a slight typo in your question).
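The asymptotic chain above can be checked numerically (a sketch I added, not part of the original answer): dividing $\Gamma\left(\frac{n+1}{2}\right)\big/\Gamma\left(\frac{n}{2}\right)$ by $\sqrt{n/2}$ should give a quotient tending to $1$ at rate $O(1/n)$.

```python
import math

# Check the Stirling-based asymptotic: Gamma((n+1)/2)/Gamma(n/2) divided
# by sqrt(n/2) should tend to 1. math.lgamma keeps large n from overflowing.
def gamma_ratio(n):
    return math.exp(math.lgamma((n + 1) / 2) - math.lgamma(n / 2))

for n in (10, 100, 10000):
    print(n, gamma_ratio(n) / math.sqrt(n / 2))
```

The printed quotients approach $1$ from below, with the deviation shrinking by roughly a factor of $10$ for each tenfold increase in $n$.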

In fact, when considering limits as $n\to \infty$, you should not have $n$ appearing in the limit itself; instead you can say that the quotient $\Gamma\left(\frac{n+1}{2}\right)\big/\left(\Gamma\left(\frac{n}{2}\right)\sqrt{\frac{n}{2}}\right)$ tends to $1$, and it turns out here that the difference $\Gamma\left(\frac{n+1}{2}\right)\big/\Gamma\left(\frac{n}{2}\right)-\sqrt{\frac{n}{2}}$ also tends to $0$. Another point is that $\sqrt{\frac{n}{2}-\frac14}$ is a better approximation, in that not only does the difference tend to $0$, but so too does the difference of the squares.
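The closing remark can also be verified numerically (my addition, under the stated claim): the approximation error of $\sqrt{\frac{n}{2}-\frac14}$ should be much smaller than that of $\sqrt{\frac{n}{2}}$ at every $n$.

```python
import math

# Compare the two approximations of the gamma ratio from the answer:
# err(n, 0.0)  -> |ratio - sqrt(n/2)|
# err(n, 0.25) -> |ratio - sqrt(n/2 - 1/4)|, claimed to be much smaller.
def err(n, shift):
    ratio = math.exp(math.lgamma((n + 1) / 2) - math.lgamma(n / 2))
    return abs(ratio - math.sqrt(n / 2 - shift))

for n in (10, 100, 1000):
    print(n, err(n, 0.0), err(n, 0.25))
```

Consistent with the remark, the $\sqrt{n/2 - 1/4}$ error decays like $O(n^{-3/2})$ while the $\sqrt{n/2}$ error decays only like $O(n^{-1/2})$.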
