Calculate the autocorrelation of a time series created from a normal distribution


I generate a time series from a normal distribution and then I try to plot the autocorrelation by using the following code snippet:

ts1 = normrnd(0,0.25,1,100);
autocorrelation_ts1 = xcorr(ts1);

I was expecting the autocorrelation to be 1 at x = 0 and almost 0 everywhere else; instead I get a value of about 6 at axis position 100.

I think the question applies to both MATLAB and Octave, but I am not sure.


There are 2 answers

Scott (best answer)

First, your second line of code had a typo; it should read

autocorrelation_ts1 = xcorr(ts1);

Other than this, your approach is correct. The reason the max value appears at 100 and not 0 is that xcorr on a length-N vector returns 2N-1 samples, covering lags -(N-1) through N-1, so lag 0 lands at element N (here 100). In other words, the numbers on the x-axis are array indices, not lags.

To get the lags on the x-axis, change your code to

[autocorrelation_ts1, shifts] = xcorr(ts1);

Then

plot(shifts, autocorrelation_ts1)
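The index arithmetic can be checked directly. This is just a sketch, reusing the ts1 from the question:

```matlab
N = 100;
ts1 = normrnd(0, 0.25, 1, N);   % the series from the question
[r, shifts] = xcorr(ts1);
numel(shifts)                   % 2*N - 1 = 199 lags, from -(N-1) to N-1
shifts(N)                       % 0: lag zero sits at index N, hence the peak at 100
```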

With regard to the max value, the MATLAB documentation for xcorr indicates that the output is not normalized by default; the raw zero-lag value is sum(ts1.^2), not 1. If you want the zero-lag value to be exactly 1, use

[autocorrelation_ts1, shifts] = xcorr(ts1, 'normalized');

(In Octave, and in older MATLAB releases, the equivalent option is 'coeff'.)
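This also explains the magnitude you observed: for white noise with standard deviation sigma and length N, the raw zero-lag value sum(ts1.^2) is roughly N * sigma^2 = 100 * 0.25^2 = 6.25, which is the "value 6" from the question. A quick check, again just a sketch with the question's ts1:

```matlab
ts1 = normrnd(0, 0.25, 1, 100);
[r, shifts] = xcorr(ts1);
r(shifts == 0)   % roughly 6.25 for this sigma and N (exactly sum(ts1.^2))
sum(ts1.^2)      % identical by definition of the raw autocorrelation
```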
M.E.

Just as a complementary reference to Scott's answer, here is the complete code snippet, including a stem chart scaled to show the first 20 shifts/lags.

ts1 = normrnd(0, 0.25, 1, 100);   % same series as in the question
[auto_ts1, lags] = xcorr(ts1);
ts_begin = ceil(size(lags,2)/2);  % index of lag 0 (center of the 2N-1 lags)
ts_end = ts_begin + 20;           % show lags 0 through 20
stem(lags(ts_begin:ts_end), auto_ts1(ts_begin:ts_end)/max(auto_ts1), ...
     'linewidth', 4.0, 'filled') % divide by the peak so lag 0 plots as 1