Solved – getting rid of negative predictions in linear regression

I'm working on a linear regression formula for a forecasting model. My model requires non-negative predictions.

The model works well when the predictors take large values; however, I often get negative predictions when the predictors take small values.

I was hoping to find a solution where the model isn't strictly linear but behaves somewhat exponentially, converging toward 0 without reaching it unless both variables are 0.

This is an example of my constant and coefficients for two variables:

    CONST   -202.4356389
    COV        0.741149304
    USERS    369.5808457
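To illustrate (the predictor values below are made up, not from my data), even moderately small inputs push the fitted value below zero:

    # Coefficients from the fitted model above.
    const, b_cov, b_users = -202.4356389, 0.741149304, 369.5808457

    # Hypothetical "small" predictor values, chosen only for illustration.
    cov, users = 10.0, 0.2

    pred = const + b_cov * cov + b_users * users
    print(pred)  # about -121.1: a negative forecast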

As @Nick Cox points out in a comment, if you want your predicted values to always be positive, you don't want linear regression. If the dependent variable is a count (and maybe even if it is not), you could use Poisson regression or negative binomial regression. If it is bounded, you can rescale it to the 0-1 interval and then use beta regression. There are other options too.
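As a minimal sketch of the Poisson option using statsmodels (the data and variable names are made up for illustration, and it assumes the DV is a count):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)

    # Made-up example data: two predictors and a count outcome.
    cov = rng.uniform(0, 50, size=200)
    users = rng.uniform(0, 2, size=200)
    y = rng.poisson(np.exp(0.5 + 0.02 * cov + 0.8 * users))

    X = sm.add_constant(np.column_stack([cov, users]))

    # Poisson GLM with its default log link: the fitted mean is exp(Xb),
    # which is always positive. For overdispersed counts, swap in
    # sm.families.NegativeBinomial().
    poisson_fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()

    print(poisson_fit.predict(X).min())  # strictly positive by construction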

Or it might be that you want to transform your dependent variable. If your DV is never negative (and never exactly zero), you can take its log and fit the regression on the log scale; the back-transformed predictions on the raw scale can then never be negative.
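A minimal sketch of the log-transform approach (again with made-up data, assuming the DV is strictly positive):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)

    # Made-up strictly positive outcome.
    x = rng.uniform(0, 50, size=200)
    y = np.exp(0.3 + 0.05 * x + rng.normal(scale=0.2, size=200))

    X = sm.add_constant(x)

    # Fit OLS on log(y), then exponentiate the fitted values:
    # exp() of any real number is positive, so raw-scale predictions
    # can never be negative.
    log_fit = sm.OLS(np.log(y), X).fit()
    pred_raw = np.exp(log_fit.predict(X))

    print(pred_raw.min())  # > 0

Note that simply exponentiating the log-scale fit estimates the conditional median rather than the mean (a smearing correction can fix that), but either way the predictions stay positive.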
