I know it is not uncommon to use LASSO for time series. But since LASSO is essentially just linear regression (or a GLM) with a constraint, what about the assumption of independent observations that these models carry? I don't think it is justified to assume the observations in a time series are independent, or am I wrong?

#### Best Answer

Are you confusing the model and the estimator here? The linear model with homoskedastic errors that you describe,

$$ \mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}, \qquad \boldsymbol{\varepsilon} \sim \text{N}_n(\mathbf{0}, \sigma^2\mathbf{I}), $$

is a separate concept from the *estimator* used to estimate $\boldsymbol{\beta}$ from the data $\{(Y_i, X_i)\}_{i=1}^n$, such as the OLS/maximum likelihood estimator

$$ \boldsymbol{\hat\beta}_{OLS} = (\mathbf{X}^T\mathbf{X})^{-1}\mathbf{X}^T\mathbf{Y}, $$

or the LASSO estimator

$$ \boldsymbol{\hat\beta}_{LASSO} = \underset{\boldsymbol{\beta}}{\text{argmin}}\; \sum_{i=1}^n (Y_i - \mathbf{X}_i\boldsymbol{\beta})^2 + \lambda \sum_{j=1}^p \vert\beta_j\vert. $$

These estimators do not rely on the assumption of independent observations. The OLS estimator remains unbiased even if $\boldsymbol{\varepsilon} \sim \text{N}_n(\mathbf{0}, \Sigma)$, where $\Sigma$ has nonzero off-diagonal elements (so that the observations are not independent); the dependence mainly makes it harder to compute valid standard errors for the parameter estimates. (Strictly speaking, with a known non-diagonal $\Sigma$ the maximum likelihood estimator is the GLS estimator $(\mathbf{X}^T\Sigma^{-1}\mathbf{X})^{-1}\mathbf{X}^T\Sigma^{-1}\mathbf{Y}$ rather than OLS.) For the LASSO estimate we cannot readily compute standard errors anyway, so an assumption of dependence between the observations is inconsequential for the fitting itself.
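To make the point concrete, here is a minimal sketch (using scikit-learn, with made-up AR(1)-correlated errors and an arbitrary penalty `alpha=0.1`) showing that the LASSO fitting procedure runs exactly the same way whether or not the observations are independent — dependence in the errors affects inference, not the optimization:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta = np.array([2.0, -1.0] + [0.0] * (p - 2))  # sparse true coefficients

# AR(1) errors: consecutive observations are correlated, i.e. NOT independent
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.7 * eps[t - 1] + rng.normal()
y = X @ beta + eps

# The LASSO objective is still well defined; the estimator minimizes
# the penalized sum of squares regardless of the error covariance.
model = Lasso(alpha=0.1).fit(X, y)
print(model.coef_)  # nonzero estimates near beta[0], beta[1]; rest shrunk toward 0
```

What the dependence *does* change is the sampling distribution of the estimates, which is why naive standard errors or cross-validation splits that ignore the serial structure can be misleading.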
