Hi all,

I am trying to solve an LP model with GAMS/CPLEX. The model returns a solution when I run it with a small set (a shorter time period), but I would like to simulate one full year, which greatly increases the size of my time set. When I run the model with the larger set, model generation time increases by a lot, which is reasonable to expect.
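For reference, here is a minimal sketch of how the time dimension scales (the set name and hourly resolution are placeholders, not my actual model):

```
* Hypothetical time set: one week at hourly resolution
Set t "hourly time steps" / t1*t168 /;

* One year at hourly resolution is roughly 52x larger:
* Set t / t1*t8760 /;
```

Every equation indexed over t is generated once per element, so the row count grows proportionally with the set size.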

Up to now I have run into many "out of memory" errors, so I simplified the problem a bit by removing some constraints. I finally managed to run a one-year simulation, using almost 32 GB of RAM, and got the following output:

55,503,501 rows  14,348,951 columns  111,313,503 non-zeroes

…

LP Presolve eliminated 52,779,141 rows and 10,792,349 columns.
Reduced LP has 2,724,360 rows, 3,556,602 columns, and 9,276,840 nonzeros.
Presolve time = 28.13 sec. (16398.93 ticks)
Compression time = 83.91 sec. (0.00 ticks)
Compression reduced 2299.45 MB A matrix to 567.77 MB

Is it normal for presolve to eliminate around 90% of my rows and columns, or does this suggest a mistake in the way I modeled the problem? How can I track which equations and variables are eliminated, so that I can give the solver a tighter formulation? With a tighter formulation I would expect shorter computation times and, hopefully, to be able to add back the constraints I removed.
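For context, the only diagnostics I know of so far are the standard GAMS listing options and passing names to CPLEX; I am not sure these actually show what presolve removes (mymodel and cost are placeholders for my actual names):

```
* Dump a sample of generated equations/variables to the .lst file
option limrow = 10, limcol = 10;

* Tell GAMS to read cplex.opt, which contains the line "names yes"
* so that CPLEX messages refer to my row/column names
mymodel.optfile = 1;
solve mymodel using lp minimizing cost;
```

Is there a better way to see which specific equations the LP presolve drops?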

Thank you.

Best regards,

Milien