I'm looking at a time series which has a very strong daily cycle in it. However, on top of having a daily cycle in the actual values of the time series, it also has a very strong daily variance cycle.

I am wondering whether I can meaningfully remove this 'variance trend' from my time series. I can calculate the 'average variance' for a given hour and then divide by it, but I'm not sure doing so actually accomplishes anything useful.

The large variance does correlate with large values in the daily cycle too, but taking a log doesn't seem to help much.
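To make the hourly rescaling I have in mind concrete, here is a sketch in Python (the synthetic data and all names are made up for illustration; the real series would replace `y` and `hours`):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly series with a daily cycle in both mean and variance
# (illustrative only, not the real data).
n_days = 200
hours = np.tile(np.arange(24), n_days)
mean_cycle = 10 + 5 * np.sin(2 * np.pi * hours / 24)
sd_cycle = 1 + (np.sin(2 * np.pi * hours / 24) + 1)   # variance tracks the mean
y = mean_cycle + sd_cycle * rng.standard_normal(hours.size)

# Remove the hourly mean cycle, then divide by the hourly standard deviation.
hourly_mean = np.array([y[hours == h].mean() for h in range(24)])
hourly_sd = np.array([y[hours == h].std(ddof=1) for h in range(24)])
z = (y - hourly_mean[hours]) / hourly_sd[hours]

# After rescaling, the per-hour standard deviations are all equal to 1
# by construction, so the 'variance cycle' is gone.
rescaled_sd = np.array([z[hours == h].std(ddof=1) for h in range(24)])
```

This is exactly the "divide by the average variance for a given hour" idea, so my question is whether this kind of standardisation is statistically meaningful here.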

EDIT: EXTRA INFORMATION

As pointed out in the comments, this problem is rather under-defined. Unfortunately, that is a symptom of the problem I am dealing with, so I can't fix it. However, I will try to give some more information.

The **goal** is to have a model of the time series from which realisations can be drawn. The time series data is a measure of concentration over time, with clear daily and yearly cycles and a linear trend. There are many covariate time series which at least follow the same daily and yearly cycles.

I am currently just trying to explore and learn different time series techniques. I wasn't hoping for a 'solution', but rather for a bit of direction, as I hadn't been able to find much information about trends in the variance.


#### Best Answer

No one will be able to help you without an example of the data or more information; however, a few things can be said. I will not elaborate on the algebra here, though; you had best research that for yourself.

The econometrics textbooks by Greene and by Hayashi are good sources on heteroscedasticity, autocorrelation, and (F)GLS. Stock and Watson is an easier introduction.

First, the variance trend can probably be modeled by a covariance matrix, i.e. $V[\varepsilon \mid X] = \Omega$.

In that case you can use the GLS estimator to eliminate the heteroscedasticity. Especially useful to you should be the FGLS techniques: you estimate the matrix from the residuals of a regular OLS fit and then iteratively refine the estimate (see the Wikipedia article on FGLS).
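A minimal FGLS sketch in Python, assuming a diagonal $\Omega$ whose entries depend on the regressors (the data and variance model below are invented for illustration; in practice the variance could instead be modeled on hour-of-day dummies):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative heteroscedastic data: error sd grows with the regressor.
n = 2000
x = rng.uniform(1.0, 5.0, n)
X = np.column_stack([np.ones(n), x])
beta_true = np.array([2.0, 0.5])
sigma = 0.3 * x
y = X @ beta_true + sigma * rng.standard_normal(n)

# Step 1: ordinary OLS to get residuals.
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_ols

# Step 2: model the variance, e.g. regress log squared residuals on X.
gamma, *_ = np.linalg.lstsq(X, np.log(resid**2 + 1e-12), rcond=None)
sigma2_hat = np.exp(X @ gamma)

# Step 3: weighted re-estimation (feasible GLS); steps 2-3 can be iterated.
w = 1.0 / sigma2_hat
Xw = X * w[:, None]                     # rows scaled by 1/sigma^2
beta_fgls = np.linalg.solve(Xw.T @ X, Xw.T @ y)
```

The weighted normal equations in step 3 solve $(X'\Omega^{-1}X)\beta = X'\Omega^{-1}y$ with the estimated $\hat\Omega$.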

As far as the dependency on the magnitude of your data is concerned, this is also eliminated by FGLS. You can see that the GLS estimator (in terms of the Generalized Method of Moments) comes down to the following moment condition:

$\mathrm{E}\left[\, x_t (y_t - x_t'\beta) / \sigma^2(x_t) \,\right] = 0$

So you should transform your model accordingly (this is equivalent to applying the GLS estimator with the matrix $\Omega$).
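The transformation is just dividing $y_t$ and $x_t$ by $\sigma(x_t)$ and running OLS on the result; a quick numerical check of the equivalence (diagonal $\Omega$ assumed known here, all names invented):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
x = rng.uniform(1.0, 3.0, n)
X = np.column_stack([np.ones(n), x])
sigma = 0.5 * x                         # known per-observation error sd
y = X @ np.array([1.0, 2.0]) + sigma * rng.standard_normal(n)

# GLS/WLS estimator with Omega = diag(sigma^2):
w = 1.0 / sigma**2
beta_wls = np.linalg.solve((X * w[:, None]).T @ X, (X * w[:, None]).T @ y)

# Transformed model: divide y and each row of X by sigma, then plain OLS.
beta_transformed, *_ = np.linalg.lstsq(X / sigma[:, None], y / sigma, rcond=None)
```

The two estimates agree to floating-point precision, which is the equivalence stated above.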

In addition to the Wikipedia article, I have written about GLS / heteroscedasticity here.

If you are using R, another strategy is to use Newey-West heteroscedasticity- and autocorrelation-robust (HAC) variance/covariance estimators. This can be achieved by adding `vcov = vcovHAC` (from the `sandwich` package) as an argument to functions such as `coeftest` from the `lmtest` package. If your dataset is large enough, this gives you consistent estimates of the coefficient standard errors.
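The Newey-West estimator is also straightforward to write by hand; a numpy sketch of the Bartlett-kernel HAC covariance of OLS coefficients, applied to synthetic AR(1)-error data (function name and data are my own):

```python
import numpy as np

def newey_west_cov(X, resid, lags):
    """HAC (Newey-West) covariance of OLS coefficients, Bartlett kernel."""
    Xu = X * resid[:, None]              # score contributions x_t * u_t
    S = Xu.T @ Xu                        # lag-0 term
    for l in range(1, lags + 1):
        w = 1.0 - l / (lags + 1.0)       # Bartlett weight
        G = Xu[l:].T @ Xu[:-l]           # lag-l cross-product
        S += w * (G + G.T)
    XtX_inv = np.linalg.inv(X.T @ X)
    return XtX_inv @ S @ XtX_inv         # sandwich form

# Example with autocorrelated errors (synthetic data).
rng = np.random.default_rng(3)
n = 1000
x = rng.standard_normal(n)
X = np.column_stack([np.ones(n), x])
e = np.zeros(n)
for t in range(1, n):                    # AR(1) errors
    e[t] = 0.6 * e[t - 1] + rng.standard_normal()
y = X @ np.array([1.0, 2.0]) + e
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
V = newey_west_cov(X, y - X @ beta, lags=5)
hac_se = np.sqrt(np.diag(V))             # HAC standard errors
```

With serially correlated errors these standard errors are typically noticeably larger than the naive OLS ones, which is the point of the correction.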