Slow GAMS when looping

Dear GAMS users,

I am running a time-series MIP over a planning horizon, with the objective of minimizing total costs within each period tau. I have 365 planning periods (tau=1 to tau=365), and I want to run the optimizations successively, so I use a loop to do so…

In principle, except for the changing parameters and initial conditions, the problem is the same for every tau considered. When running, however, the process keeps getting slower with each successive tau, and my computer almost freezes around tau=60.



Besides, the .lst file was growing very large, so I checked some online help and found the recommendation that adding


$offsymxref offsymlist
$offlisting
option
   limrow   = 0,
   limcol   = 0,
   solprint = off,
   sysout   = off;


could reduce the size of the .lst file and increase the speed.



The .lst file did become dramatically smaller when I followed the recommendations, but execution still slows down extremely with each increase in tau.



Any help is highly appreciated.



Best regards,

Yonas


To unsubscribe from this group and stop receiving emails from it, send an email to gamsworld+unsubscribe@googlegroups.com.
To post to this group, send email to gamsworld@googlegroups.com.
Visit this group at http://groups.google.com/group/gamsworld?hl=en.
For more options, visit https://groups.google.com/groups/opt_out.

\

Dear Yonas,

Can you verify that the problem is indeed the same for each tau considered, by checking the number of constraints and variables at each iteration? (These numbers are displayed both on screen and in the .lst file.)
When you say the process is getting slower, do you mean that the time spent actually solving each iteration increases, or that the GAMS execution part is getting slower?

To go any further, I would need to inspect the code…
Cheers


On Tuesday, February 26, 2013 5:43:45 PM UTC+1, Yonas Gebrekiros wrote:


\

Dear Dax,

Thanks for the reply.
Here is a snippet of the code:

set
   tau /tau1*tau365/
   Cur(tau);
Cur(tau) = no;

.
.
.
Equations
   Eq1(tau) Equation1
   Eq2      Equation2
;

Eq1(Cur).. …
* (equation definition elided)
Eq2..  Z =e= sum(Cur, z1(Cur));

model problem /Eq1, Eq2/;
loop(tau,
   Cur(tau)   = yes;
   Cur(tau-1) = no;
   solve problem using mip minimizing Z;
);

This is basically what my model looks like.

To say more: the model has three blocks that are optimized in succession, with the output of one block taken as input to the next. I also use the final output of the last block as input for the next loop iteration (tau).
In effect (at least as I understand it), the speed should be the same for each tau, since the problem does not grow with each loop, although that is what it feels like is happening in my case.

I am running the program on a server with 20 GB of RAM. It reaches tau5 (the 5th loop) in 1 minute, then gradually slows down; it has taken about 38 hours to reach the 194th loop. When I check the physical memory usage, it climbs to 99%.

I am not sure if what I wrote is understandable, but I can add more explanation or send the code for diagnosis.

Thanks for the help.

Yonas

P.S.
Below is what I copied from the .lst for different tau values


LOOPS tau tau1

MODEL STATISTICS

BLOCKS OF EQUATIONS 30 SINGLE EQUATIONS 232,480
BLOCKS OF VARIABLES 26 SINGLE VARIABLES 145,984
NON ZERO ELEMENTS 385,878 DISCRETE VARIABLES 33,456

GENERATION TIME = 1.404 SECONDS 118 Mb WEX236-236 Apr 6, 2011
EXECUTION TIME = 1.623 SECONDS 118 Mb WEX236-236 Apr 6, 2011

LOOPS tau tau144

MODEL STATISTICS

BLOCKS OF EQUATIONS 30 SINGLE EQUATIONS 232,480
BLOCKS OF VARIABLES 26 SINGLE VARIABLES 145,984
NON ZERO ELEMENTS 385,878 DISCRETE VARIABLES 33,456


GENERATION TIME = 1.560 SECONDS 16,104 Mb WEX236-236 Apr 6, 2011
EXECUTION TIME = 1.731 SECONDS 16,104 Mb WEX236-236 Apr 6, 2011

LOOPS tau tau190


MODEL STATISTICS

BLOCKS OF EQUATIONS 30 SINGLE EQUATIONS 232,480
BLOCKS OF VARIABLES 26 SINGLE VARIABLES 145,984
NON ZERO ELEMENTS 385,878 DISCRETE VARIABLES 33,456

GENERATION TIME = 1.810 SECONDS 21,231 Mb WEX236-236 Apr 6, 2011
EXECUTION TIME = 2.075 SECONDS 21,231 Mb WEX236-236 Apr 6, 2011


On Tuesday, February 26, 2013 5:43:45 PM UTC+1, Yonas Gebrekiros wrote:


\

Dear Yonas,

Have you considered saving the solution and freeing the RAM after finding each solution?

regards,
Henry vermue

On Wednesday, February 27, 2013 3:44:09 PM UTC+1, Yonas Gebrekiros wrote:


\

Dear Henry,

Could you elaborate a little on what you mean by that?
What happens is that when the program is done with all the looping, I export the results to Excel. I am not sure where the results are temporarily stored in the meantime, though. :-)

Best regards,
Yonas

On Wednesday, February 27, 2013 5:01:26 PM UTC+1, Henry JW Vermue wrote:


\

Dear Yonas

Well, I don’t know exactly what GAMS stores, but most programs do not store zeros. How about setting the elements of the variables related to previously solved problems to zero? Obviously Z does not depend on tau, but you can do this for the variables indexed by tau (e.g. z1).
As for saving/exporting data: try using GDX, or simply display the variables within the loop.

Of course, clearing and/or displaying data after every single iteration might not be a good idea, but you could do it, say, every 10 or 20 iterations.
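A rough sketch of what I mean (the names z1_store and results.gdx are just placeholders, and I am guessing at your loop structure from the snippet you posted):

```gams
* illustrative sketch: copy each period's solution into a parameter,
* zero the variable record afterwards, and dump everything to GDX
parameter z1_store(tau) 'stored solution levels';

loop(tau,
   Cur(tau)   = yes;
   Cur(tau-1) = no;
   solve problem using mip minimizing Z;
*  keep only the level values, then clear the variable record
   z1_store(tau) = z1.l(tau);
   z1.l(tau)     = 0;
);

* everything solved so far ends up in one file you can read later
execute_unload 'results.gdx', z1_store;
```

This keeps the solution data in a single parameter rather than in the variable records, which may or may not help depending on what GAMS actually keeps in memory.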

Kind regards,
Henry Vermue

On Wednesday, February 27, 2013 5:30:23 PM UTC+1, Yonas Gebrekiros wrote:

Dear Henry,

Could you elaborate a little on what you mean by that.
What happens is when the program is done with all the looping, I export the results to Excel. I am not sure where the results are temporarily stored, though :slight_smile:

Best regards,
Yonas

On Wednesday, February 27, 2013 5:01:26 PM UTC+1, Henry JW Vermue wrote:

Dear Yonas,

Have you considered saving the solution and free the RAM after finding the solution?

regards,
Henry vermue

On Wednesday, February 27, 2013 3:44:09 PM UTC+1, Yonas Gebrekiros wrote:

Dear Dax,

Thanks for the reply.
Here is a snippet of the code:

set
tau /tau1*tau365/
Cur(tau);
Cur(tau)=no;

.
.
.
Equations
Eq1(tau) Equation1
Eq2 Equation2
;

Eq1(Cur) … /*Equation definition
Eq2 … Z= sum(cur, z1(cur));

model /Eq1, Eq2/
loop (tau,
Cur(tau)=yes;
Cur(tau-1)=no;
solve problem using mip minizing Z;
);

This is basically what my model looks like.

To say more, it has three blocks that are optimized in succession (With output from one taken as input to the next block.). I also use the final output from the last block as an input for the next loop (tau).
In effect (at least that is what I understand) the speed should be the same for each tau. I say this because the problem is not growing with each loop, although that is what it feels like is happening in my case.

I am running the program in a server with 20 GB RAM. It reaches tau5 (5th loop in 1 min) then gradually slows and it has taken about 38 hours to reach the 194th loop. What is happening when I check the physical memory usage is, it reaches up to 99%.

I am not sure if what I wrote is understandable, but I can add more explanation or send the code for diagnosis.

Thanks for the help.

Yonas

P.S.
Below is what I copied from the .lst for different tau values


LOOPS tau tau1

MODEL STATISTICS

BLOCKS OF EQUATIONS 30 SINGLE EQUATIONS 232,480
BLOCKS OF VARIABLES 26 SINGLE VARIABLES 145,984
NON ZERO ELEMENTS 385,878 DISCRETE VARIABLES 33,456

GENERATION TIME = 1.404 SECONDS 118 Mb WEX236-236 Apr 6, 2011
EXECUTION TIME = 1.623 SECONDS 118 Mb WEX236-236 Apr 6, 2011

LOOPS tauo tau144

MODEL STATISTICS

BLOCKS OF EQUATIONS 30 SINGLE EQUATIONS 232,480
BLOCKS OF VARIABLES 26 SINGLE VARIABLES 145,984
NON ZERO ELEMENTS 385,878 DISCRETE VARIABLES 33,456


GENERATION TIME = 1.560 SECONDS 16,104 Mb WEX236-236 Apr 6, 2011
EXECUTION TIME = 1.731 SECONDS 16,104 Mb WEX236-236 Apr 6, 2011

LOOPS tau tau190


MODEL STATISTICS

BLOCKS OF EQUATIONS 30 SINGLE EQUATIONS 232,480
BLOCKS OF VARIABLES 26 SINGLE VARIABLES 145,984
NON ZERO ELEMENTS 385,878 DISCRETE VARIABLES 33,456

GENERATION TIME = 1.810 SECONDS 21,231 Mb WEX236-236 Apr 6, 2011
EXECUTION TIME = 2.075 SECONDS 21,231 Mb WEX236-236 Apr 6, 2011


On Tuesday, February 26, 2013 5:43:45 PM UTC+1, Yonas Gebrekiros wrote:

Dear GAMS users,

I am running a time series MIP for a planning period, (with the objective of minimizing total costs within tau). I have 365 planning periods and (tau=1 to tau=365). I want to do the optimization successively and I use looping to do so…

In principle, except for the changing parameters and initial conditions, the problem is the same for all tau considered. When running, however, the process keeps on getting slower with each increasing tau and my computer almost freezes at tau=60.



Besides, the .lst was growing so large and I checked on some online help and I found the recommendation that adding


$offsymxref offsymlist

$offlisting

option

limrow = 0,

limcol = 0,

solprint = off,

sysout = off;


could reduce the size of the .lst file and increase the speed.



The size of the .lst file was dramatically low when I followed the recommendations but the speed is still extremely slow for each increase in tau.



Any help is highly appreciated.



Best regards,

Yonas


To unsubscribe from this group and stop receiving emails from it, send an email to gamsworld+unsubscribe@googlegroups.com.
To post to this group, send email to gamsworld@googlegroups.com.
Visit this group at http://groups.google.com/group/gamsworld?hl=en.
For more options, visit https://groups.google.com/groups/opt_out.

\

Dear Henry,

Can you please write me some code snippets for your ideas of clearing the RAM after finding the solution, and for setting the variables related to previously solved problems to zero? Your ideas look right to me, but I am not sure how to proceed in implementing them.

I appreciate your time and consideration.

Yonas

On Thursday, February 28, 2013 2:23:42 PM UTC+1, Henry JW Vermue wrote:


\

Hello Yonas,

Unfortunately it’s less clear-cut than I’d thought.
Since I suggested multiple approaches, I’ll give the most sensible one to implement (which is most likely not the fastest).

loop(tau,
   Cur(tau)   = yes;
   Cur(tau-1) = no;
*  also clear all previous variable records that depend on tau
   z1.l(tau-1) = 0;
   …(tau-1,…) = 0;

   solve problem using mip minimizing Z;
*  now display here what you would otherwise display after the loop
   display z1.l, … ;
);

This will display your results at every step (which is necessary, because you effectively destroy any previous results).
Again, you could also display only every 5 or 10 loops (with an if statement), but then you would need to clear the data in a different way. Another idea is to remove all dependencies on tau from your model, but with the given information I can’t make a fair judgment on that.
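For the every-N-loops variant, a rough sketch (again assuming your tau/Cur setup from the earlier snippet; the alias taup is something I am introducing here just for the clearing step):

```gams
* illustrative sketch: display and clear in batches of 10 periods
alias (tau, taup);

loop(tau,
   Cur(tau)   = yes;
   Cur(tau-1) = no;
   solve problem using mip minimizing Z;
*  every 10th period: display the accumulated levels, then zero out
*  all records solved so far to keep the in-memory data sparse
   if(mod(ord(tau), 10) = 0,
      display z1.l;
      z1.l(taup)$(ord(taup) <= ord(tau)) = 0;
   );
);
```

The batch size of 10 is arbitrary; the trade-off is between listing-file clutter (displaying often) and memory growth (clearing rarely).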

regards,
Henry

On Friday, March 1, 2013 10:57:32 AM UTC+1, Yonas Gebrekiros wrote:


\