I'm a novice in time series analysis and completely lost in my reading, so..

I have an enormous dataset of discrete time series of aggregated events, and I wish to fit each one of them to an ARIMA model. Therefore I have to ensure that the time series are stationary, which I do with ADF and KPSS tests. However, I wonder whether these tests also make separate checks for trend and seasonality unnecessary. Is there any statistical criterion that reveals that a time series is trending?
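To make the question concrete, here is the sort of criterion I have in mind (a minimal pure-Python sketch, though in practice I would do this in R; the function name and the toy series are my own): regress each series on a time index and look at the t-statistic of the slope. I realize a plain t-test is unreliable when the residuals are autocorrelated, which is partly why I'm asking.

```python
import math

def trend_t_stat(y):
    """OLS regression of y on t = 0..n-1; returns (slope, t-statistic of slope)."""
    n = len(y)
    t = list(range(n))
    t_bar = sum(t) / n
    y_bar = sum(y) / n
    sxx = sum((ti - t_bar) ** 2 for ti in t)
    sxy = sum((ti - t_bar) * (yi - y_bar) for ti, yi in zip(t, y))
    slope = sxy / sxx
    intercept = y_bar - slope * t_bar
    residuals = [yi - (intercept + slope * ti) for ti, yi in zip(t, y)]
    s2 = sum(r ** 2 for r in residuals) / (n - 2)  # residual variance
    se = math.sqrt(s2 / sxx)                       # standard error of the slope
    return slope, slope / se

# A clearly trending toy series (slope 0.5 plus small alternating noise)
# should produce a slope near 0.5 and a very large |t|:
trending = [0.5 * i + ((-1) ** i) * 0.3 for i in range(50)]
slope, t_stat = trend_t_stat(trending)
```

A large |t| (say, beyond the usual ~2 cutoff) would flag the series as trending, but this is only the naive version of the idea, not a recommendation.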

Visual inspection of time plots is difficult in this case since the dataset is way too big.

Thanks in advance,

btw I'm an R user 😉

-A


#### Best Answer

When faced with the daunting task of evaluating a "zillion" time series for local time trends, one could spend a lot of time reviewing graphs, which may contain unusual data points that obfuscate the trend, or an embedded autoregressive structure that could likewise defeat your eyeballing. A more efficient approach is to perform a computerized search, which could be classified as a productivity aid for your bleary eyes. The suggested procedures for detecting time trends and other deterministic structure have all been peer-reviewed and are transparent: these are not ad hoc methods but state-of-the-art procedures to help you in your quest. Look at my answer and the earlier references in Fancy detrending of time series.
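As a taste of what such a computerized search might apply to each series, a median-of-pairwise-slopes (Theil-Sen) estimate is far less sensitive to the unusual data points mentioned above than an ordinary least-squares fit. This is a hedged pure-Python sketch of that one idea, not the full procedure I am referring to; the helper name and toy data are my own, and a real screen would pair the slope estimate with a significance test (e.g. Mann-Kendall).

```python
from itertools import combinations
from statistics import median

def theil_sen_slope(y):
    """Median of all pairwise slopes (y[j] - y[i]) / (j - i), for j > i.
    Robust to a small number of gross outliers, unlike the OLS slope."""
    n = len(y)
    slopes = [(y[j] - y[i]) / (j - i) for i, j in combinations(range(n), 2)]
    return median(slopes)

# A clean linear series with slope 2, and the same series with one gross
# outlier injected; the robust slope estimate barely moves:
clean = [2.0 * t for t in range(30)]
spiked = clean.copy()
spiked[10] = 500.0  # one wild data point that would wreck an eyeball read
```

Looping such a statistic over the whole dataset, then sorting series by the flagged slope, is the kind of "productivity aid" meant here.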