We consider a sparse autoregressive time series of length 1000 obeying the model

$$X(t)=0.2X(t-1)+0.1X(t-3)+0.2X(t-5)+0.3X(t-10)+0.1X(t-15)+Z(t)$$

with nonzero coefficients at lags 1, 3, 5, 10, and 15, where the innovations $Z(t)$ are i.i.d. Gaussian with mean zero and standard deviation 0.1.

The question is: how can one simulate 1000 time series from this model in R or SAS?


#### Best Answer

In R, fill a vector with Gaussian white noise of zero mean and sd 0.1, then use the `filter` function with `method = "recursive"`. Notice that the startup values of the recursion are not generated by your model (the recursion starts from zeros rather than from the stationary distribution), so you have to discard an initial section of the generated observations (say 10 times the maximum lag).
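A minimal sketch of this approach for one series, assuming a burn-in of 10 times the maximum lag (the seed, variable names, and burn-in length are illustrative choices, not from the original answer):

```r
set.seed(1)

n       <- 1000                       # desired series length
max.lag <- 15
burn    <- 10 * max.lag               # burn-in to discard

# Sparse AR coefficient vector: zeros except at lags 1, 3, 5, 10, 15
coefs <- numeric(max.lag)
coefs[c(1, 3, 5, 10, 15)] <- c(0.2, 0.1, 0.2, 0.3, 0.1)

# Gaussian white noise innovations Z(t), mean 0, sd 0.1
z <- rnorm(n + burn, mean = 0, sd = 0.1)

# Recursive filtering implements X(t) = sum_j coefs[j] * X(t-j) + Z(t)
x <- stats::filter(z, filter = coefs, method = "recursive")

# Drop the burn-in section so the startup transient is gone
x <- as.numeric(x[-(1:burn)])
```

To get 1000 independent series, this can simply be wrapped in `replicate(1000, ...)`, giving a matrix with one series per column.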