The characteristic function of the Fisher $\mathcal{F}(1,\alpha)$ distribution is:
$$C(t)=\frac{\Gamma\left(\frac{\alpha+1}{2}\right)\, U\left(\frac{1}{2},\,1-\frac{\alpha}{2},\,-i t \alpha\right)}{\Gamma\left(\frac{\alpha}{2}\right)}$$
where $U$ is the confluent hypergeometric function of the second kind. I am trying to compute the inverse Fourier transform $\mathcal{F}_{t,x}^{-1}$ of the $n$-fold convolution to recover the density of a variable $x$, that is:
$$\mathcal{F}_{t,x}^{-1}\left(C(t)^n\right)$$
with the purpose of obtaining the distribution of the sum of $n$ Fisher-distributed random variables. It seems very difficult to solve, and I wonder if anyone has an idea; I tried $\alpha=3$ and $n=2$ to no avail.
Note: for $n=2$, by convolution I get the pdf of the average (not the sum):
$$\frac{3 \left(12 \left(x^2+3\right) \left(5 x^2-3\right) x^2+9 \left(20 x^4+27 x^2+9\right) \log \left(\frac{4 x^2}{3}+1\right)+2 \sqrt{3} \left(x^2+15\right) \left(4 x^2+3\right) x^3 \tan ^{-1}\left(\frac{2 x}{\sqrt{3}}\right)\right)}{\pi ^2 x^3 \left(x^2+3\right)^3 \left(4 x^2+3\right)},$$
where $x$ is the average of the 2 variables. I know it is unwieldy, but I would love to get an idea of an approximation to the basin distribution.
Best Answer
There is no closed-form density for a convolution of F-statistics, so trying to invert the characteristic function analytically is not likely to lead to anything useful.
In mathematical statistics, the tilted Edgeworth expansion (also known as the saddlepoint approximation) is a well-known and widely used technique for approximating a density function from its characteristic function. The saddlepoint approximation is often remarkably accurate. Ole Barndorff-Nielsen and David Cox wrote a textbook explaining this technique.
There are other ways to approach the problem without using the characteristic function. One would expect the convolution to be something like an F-distribution in shape, so one might try an approximation of the form $a F(n,k)$ for the $n$-fold convolution and then choose $a$ and $k$ so that the first two moments of the approximation match those of the convolution. This is easy given the known mean and variance of the F-distribution.
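Here is a minimal sketch of that moment matching in Python (my own illustration, not part of the original answer); the helper names `f_mean_var` and `match_aF` are hypothetical, and the calculation assumes $\alpha > 4$ so that each $F(1,\alpha)$ summand has a finite variance:

```python
# Moment matching for the sum of n i.i.d. F(1, alpha) variables
# by a scaled F-distribution a * F(n, k) (requires alpha > 4).
import numpy as np
from scipy import stats


def f_mean_var(d1, d2):
    """Mean and variance of the F(d1, d2) distribution (needs d2 > 4)."""
    mean = d2 / (d2 - 2)
    var = 2 * d2**2 * (d1 + d2 - 2) / (d1 * (d2 - 2) ** 2 * (d2 - 4))
    return mean, var


def match_aF(n, alpha):
    """Choose (a, k) so that a * F(n, k) has the same mean and variance
    as the sum of n independent F(1, alpha) variables."""
    m1, v1 = f_mean_var(1, alpha)
    m, v = n * m1, n * v1  # mean and variance of the sum
    # Solving  a*k/(k-2) = m  and  a^2 * 2k^2(n+k-2) / (n(k-2)^2(k-4)) = v:
    k = (4 * m**2 - 2 * m**2 * n - 4 * v * n) / (2 * m**2 - v * n)
    a = m * (k - 2) / k
    return a, k


if __name__ == "__main__":
    n, alpha = 2, 10.0  # illustrative values; alpha must exceed 4
    a, k = match_aF(n, alpha)
    print(f"approximation: {a:.4f} * F({n}, {k:.4f})")

    # Monte Carlo check that the first two moments really agree
    rng = np.random.default_rng(0)
    s = stats.f.rvs(1, alpha, size=(200_000, n), random_state=rng).sum(axis=1)
    m_apx, v_apx = f_mean_var(n, k)
    print("simulated mean/var:", s.mean(), s.var())
    print("matched   mean/var:", a * m_apx, a**2 * v_apx)
```

As a sanity check, for $n=1$ the matching returns $a=1$ and $k=\alpha$, i.e. it recovers the original $F(1,\alpha)$ distribution exactly.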
If $\alpha$ is large, then each $F(1,\alpha)$ summand is approximately $\chi^2_1$, so the convolution converges to a chi-square distribution on $n$ degrees of freedom. This is equivalent to choosing $a=n$ and $k=\infty$ in the above approximation, showing that the simple approximation is accurate for large $\alpha$.
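A quick simulation sketch of this large-$\alpha$ limit (again my own, with illustrative values of $n$ and $\alpha$):

```python
# For large alpha, F(1, alpha) is approximately chi-square(1), so the sum of
# n independent F(1, alpha) variables is approximately chi-square(n).
import numpy as np
from scipy import stats

n, alpha = 2, 1000.0  # illustrative values; alpha deliberately large
rng = np.random.default_rng(0)
s = stats.f.rvs(1, alpha, size=(100_000, n), random_state=rng).sum(axis=1)

# The simulated quantiles should be close to the chi-square(n) quantiles.
probs = [0.1, 0.25, 0.5, 0.75, 0.9, 0.99]
print("simulated quantiles:", np.round(np.quantile(s, probs), 3))
print("chi2(n) quantiles  :", np.round(stats.chi2.ppf(probs, df=n), 3))
```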