Solved – About calculating log-likelihood with zeroes

I would like to use the maximum log-likelihood method to find which continuous uniform distribution with parameters $a$ and $b$ best fits some observed data values $(x_{1}, \dots, x_{n})$.

I guess the best answer is always $[\min(x), \max(x)]$, but I am doing this as an exercise in implementing the relevant algorithms.

The log-likelihood is equal to $$\mathcal{L} = \sum_{i=1}^{n} \log f(x_{i} \mid a, b).$$

However, there is a problem with observations $x_{i} < a$ or $x_{i} > b$ (where the density is zero), since the logarithm is undefined at zero.

What's the preferred way to correct for this? I first thought of calculating $\log(1 + x_{i})$ (since the optimization algorithm does not care about the plus one), but then I started to wonder whether some other alternative would be better. Is there any best practice regarding this?
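To illustrate the issue numerically, here is a minimal sketch (names like `uniform_loglik` are illustrative, not from the original post): as soon as any observation falls outside $[a, b]$, the density is zero there and the log-likelihood collapses to $-\infty$.

```python
import numpy as np

def uniform_loglik(x, a, b):
    # Uniform density: 1/(b - a) inside [a, b], 0 outside.
    dens = np.where((x >= a) & (x <= b), 1.0 / (b - a), 0.0)
    with np.errstate(divide="ignore"):  # log(0) -> -inf without a warning
        return np.sum(np.log(dens))

x = np.array([0.2, 0.5, 0.9])
print(uniform_loglik(x, 0.0, 1.0))  # → 0.0 (all points inside [0, 1])
print(uniform_loglik(x, 0.3, 1.0))  # → -inf (0.2 lies below a = 0.3)
```

Any generic optimizer fed this objective sees an infinite (or undefined) value the moment $a$ or $b$ moves past an observation.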

The problem is that the objective function of your optimization problem is not defined for all $a$ and $b$. Let's begin by taking a closer look at $f$:
$$ f(x \mid a, b) = \begin{cases} \frac{1}{b - a} & \text{if } a \leq x \leq b \\ 0 & \text{otherwise} \end{cases} $$
Now let's formulate the optimization problem:
$$ \max_{a,b} \; \sum_i \log f(x_i \mid a, b) $$
Given the definition of $f$ above, it should be clear that the objective function is only defined when $a \leq x_i \leq b \; \forall i$. You could address this problem by maximizing the likelihood instead of the log-likelihood, like so:
$$ \max_{a,b} \; \prod_i f(x_i \mid a, b) $$
But now you'll have a discontinuity at $a = \min_i x_i$ and $b = \max_i x_i$. Most optimization routines are going to have issues with this. So you need to put bounds on your problem. If you properly bound your problem, you can maximize either the likelihood or the log-likelihood:
$$ \begin{aligned} & \max_{a, b} & & \sum_i \log f(x_i \mid a, b) \\ & \text{subject to} & & a \leq \min_i x_i, \\ &&& b \geq \max_i x_i \end{aligned} $$
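The bounded formulation above can be sketched with `scipy.optimize.minimize` (this particular implementation is my own illustration, not part of the original answer). Inside the feasible region $a \leq \min_i x_i$, $b \geq \max_i x_i$, every observation lies in $[a, b]$, so the log-likelihood simplifies to $-n \log(b - a)$ and the bounds keep the optimizer away from the undefined region:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = rng.uniform(2.0, 5.0, size=100)  # synthetic data for illustration
lo, hi = x.min(), x.max()

def neg_loglik(params):
    # Within the feasible region every x_i is in [a, b], so the
    # negative log-likelihood reduces to n * log(b - a).
    a, b = params
    return len(x) * np.log(b - a)

res = minimize(
    neg_loglik,
    x0=[lo - 1.0, hi + 1.0],            # any feasible starting point
    bounds=[(None, lo), (hi, None)],    # a <= min x_i, b >= max x_i
    method="L-BFGS-B",
)
a_hat, b_hat = res.x
```

Because shrinking $b - a$ always increases the likelihood, the optimizer is driven onto the bounds, recovering $\hat{a} = \min_i x_i$ and $\hat{b} = \max_i x_i$, exactly the closed-form MLE the question anticipated.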
