We consider a sparse autoregressive time series of length 1000 obeying the model

X(t) = a1 X(t-1) + a3 X(t-3) + a5 X(t-5) + a10 X(t-10) + a15 X(t-15) + Z(t),

with nonzero coefficients at lags 1, 3, 5, 10 and 15, where the innovations Z(t) are i.i.d. Gaussian with mean zero and standard deviation 0.1.
The question: how can one simulate such a series from this model in R or SAS?
In R, fill a vector with Gaussian white noise of mean zero and sd 0.1, then pass it through the filter function with method = "recursive". Note that the recursion starts from arbitrary initial values (zeros by default), so the first stretch of observations is not generated by the stationary model; you should discard a burn-in section of your generated series (say, 10 times the maximum lag, i.e. 150 observations).
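A minimal sketch of this recipe using stats::filter. The coefficient values below are hypothetical (the question does not state them); they are chosen so that their absolute values sum to less than 1, a simple sufficient condition for stationarity:

```r
set.seed(1)

# Hypothetical coefficients at lags 1, 3, 5, 10, 15 (values are assumptions)
phi <- numeric(15)
phi[c(1, 3, 5, 10, 15)] <- c(0.3, 0.2, -0.15, 0.1, -0.05)

n    <- 1000        # desired series length
burn <- 10 * 15     # burn-in: 10 times the maximum lag

# Gaussian innovations with mean 0 and sd 0.1, including the burn-in stretch
z <- rnorm(n + burn, mean = 0, sd = 0.1)

# Recursive filtering implements the AR recursion
# x[t] = phi[1]*x[t-1] + ... + phi[15]*x[t-15] + z[t]
x <- stats::filter(z, phi, method = "recursive")

# Discard the burn-in so the kept observations follow the stationary model
x <- x[(burn + 1):(burn + n)]
```

Alternatively, arima.sim(model = list(ar = phi), n = 1000, sd = 0.1) does the same job, checks stationarity of the coefficients, and handles the burn-in internally.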