Hello everyone

I have a problem with defining an index based on two sets.

My equation is y(j,k,t) = Q(k, t+L-1, L), where t and L both represent time, but t runs from 1 to 10 and L from 1 to 3. My problem is with the index (t+L-1): I do not know how to define it in GAMS. I put a simplified version of what I want to write below, but I receive error 10, "',' expected", which means GAMS cannot accept the t+l-1 in that position.

set
t /1*10/
;
Alias(l,t);
parameter
Q(t+l-1,l)
;
Q(t+l-1,l)$(ord(l)<4) = uniform(100,200);
display Q;
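For comparison, here is one way this is commonly written in GAMS. This is a sketch based on my reading of the intent, with two assumptions: l is meant to run only 1 to 3, so it is declared as its own set rather than as an alias of t (an alias would make l run 1 to 10); and the shift t+l-1 belongs in the assignment, not in the declaration, because a parameter's declaration domain may only contain plain set names. The index arithmetic is then expressed with the lead operator:

```gams
set
  t 'time periods' /1*10/
  l 'lead length'  /1*3/
;

* declaration domain uses plain sets only -- no arithmetic here
parameter Q(t,l);

* the shift t+l-1 is written with the lead operator in the assignment;
* combinations where t+(ord(l)-1) falls outside set t are simply skipped
Q(t+(ord(l)-1), l) = uniform(100,200);

display Q;
```

The same ord-based lead should also work inside an equation definition, so the original relationship y(j,k,t) = Q(k, t+L-1, L) can be written with Q indexed as Q(k, t+(ord(l)-1), l) on the right-hand side.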

I would greatly appreciate any advice, as I have been struggling with this a lot.