I am trying to reintroduce autocorrelation and heteroskedasticity into my simulated residuals. My simulated (standardized) residuals have the dimensions (horizon, nTrials, nIndices).

In order to calculate today's mean/variance (i.e. at time t), I need to use the last period's mean/variance (i.e. at t-1) as an input. This is where I am stuck; I can't get this part of the code to run. I try to refer to the last period's value with {t-1} (for example in R_{t-1}), but I receive the error message that R_ is undefined.

I would be very happy about any hints on where I am going wrong here.
Carolin
for i=1:nIndices
    for j=1:nTrials
        for t=1:horizon
            R_t = AA_alpha(i,:) + AA_beta(i,:) * R_{t-1} + sqrt(h_{t-1}) * Z(t,j,i);
            h_t = AA_alpha1(i,:) + AA_GARCH(i,:)*variances(j,i) + AA_ARCH(i,:)*Z({t-1},j,i) + AA_LEVERAGE*ZCopy{t-1};
            TR_t = TR_{t-1} * exp(R_t);
        end
    end
end
I think you are a bit confused about how matrix indexing works in Matlab.
If I understood correctly, you have a variable TR_t in which you want to store the value for time t. You then try to do the following:

TR_t = TR_{t-1} * exp(R_t);

Let me explain in words what is going on in this statement.
There are multiple problems here. First of all, the variable TR_ doesn't exist, because you named it TR_t: Matlab parses TR_{t-1} not as "TR at time t-1" but as "the variable TR_, indexed with {t-1}", which is exactly why your error message says that R_ is undefined. Second of all, the {t-1} curly-brace syntax is indexing into a cell array, and you are trying to apply it to something that isn't one. Before proceeding, I suggest you carefully study the Matrix Indexing article and try again.
But just to help you understand what is going on more quickly, here is a rewritten version of your code with comments so you can follow what I'm doing.
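This is a sketch, not a drop-in solution: I am keeping your variable names, guessing at the preallocation shapes and initial values (zeros/ones), and assuming Z and ZCopy both have dimensions (horizon, nTrials, nIndices) — adjust those parts to your actual model:

```matlab
% Store each series in an array indexed by t, instead of inventing a new
% variable name per period (R_t, h_t, TR_t).
R  = zeros(horizon, nTrials, nIndices);  % simulated returns
h  = ones(horizon, nTrials, nIndices);   % conditional variances (initial value assumed)
TR = ones(horizon, nTrials, nIndices);   % cumulative total return (start at 1 assumed)

for i = 1:nIndices
    for j = 1:nTrials
        for t = 2:horizon   % start at t = 2 so that t-1 is a valid index
            % Last period's values are just the (t-1)-th array elements:
            R(t,j,i)  = AA_alpha(i,:) + AA_beta(i,:) * R(t-1,j,i) ...
                        + sqrt(h(t-1,j,i)) * Z(t,j,i);
            h(t,j,i)  = AA_alpha1(i,:) + AA_GARCH(i,:) * variances(j,i) ...
                        + AA_ARCH(i,:) * Z(t-1,j,i) ...
                        + AA_LEVERAGE * ZCopy(t-1,j,i);
            TR(t,j,i) = TR(t-1,j,i) * exp(R(t,j,i));
        end
    end
end
```

Note that I left your GARCH term as AA_GARCH(i,:)*variances(j,i), exactly as you wrote it; if you intended the recursion to use the lagged conditional variance, you would replace that with h(t-1,j,i).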
I hope that helps with your understanding of how Matlab works. If you want help with your code in general, consider posting your code on Code Review Stack Exchange to get constructive criticism and suggestions for making your code better and cleaner.