How to force GAMS to NOT warm-start

Hello to all.
I am currently working on an NLP model that has to be solved several times, changing only parameter values. On every new “iteration” (i.e. each time a new optimization starts with new parameter values) I want all variables to be initialized with specific values fixed by me, and NOT warm-started from the variable values obtained by the previous solve, since I want to avoid local solutions close to the latter. If I’m not wrong, this can be done by assigning the variable.l values inside the loop before each solve and by adding option bratio = 1;.
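
To make it concrete, here is a minimal sketch of what I mean (the model, the variable and parameter names, and the toy objective are all made up; the real model is much larger):

$eolcom //

Set r "runs with different parameter values" / r1*r10 /;
Parameter p(r) "parameter value used in run r";
p(r) = ord(r);

Scalar pcur "parameter value of the current run" / 0 /;

Variable x, z;
Equation defz;
defz.. z =e= sqr(x - pcur) + sin(5*x);   // toy objective with several local minima

Model m / all /;
option bratio = 1;   // never build an advanced basis from the previous solve

loop(r,
   pcur = p(r);
   x.l  = 0;         // my own starting value, set again before every solve
   solve m using nlp minimizing z;
);

My understanding is that bratio = 1 tells GAMS to discard any basis information from the previous solve, and the x.l assignment should then give every solve the same starting point.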

Nevertheless, in my final results I never obtain the same solution for the same parameter values (the problem has a unique global solution). On the other hand, when I do some manual tests (running e.g. 10 optimizations by hand) the same solution is attained every time. This has made me think that GAMS is changing my initialization somehow.

Any ideas??

Many thanks in advance.



Manuel: I will keep reading this thread as more experienced people weigh in, since they surely have better solutions that I can learn from too, but I know first-hand how frustrating NLP can be, so here is a quick idea:

When solving in a loop, I also clear the marginals of the variables (because I have seen that they influence the solution obtained), for example with option clear = var_name; which resets the marginal along with the level.

Again, I’m not a GAMS expert, and I still haven’t found a way to simply “start from scratch” in terms of levels and marginals. For now, I have to clear every variable’s level and marginal by hand using “option clear”.
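
To make it concrete, this is roughly the pattern I use, shown on a toy model (just a sketch, all names are placeholders; here I reset the level and the marginal by plain assignment):

$eolcom //

Set r / r1*r5 /;
Parameter p(r);
p(r) = ord(r);
Scalar pcur / 0 /;

Variable x, z;
Equation defz;
defz.. z =e= sqr(x - pcur) + cos(3*x);

Model m / all /;

loop(r,
   pcur = p(r);
*  option clear = x;  should do the same reset, but for all attributes of x at once
   x.l = 0;   // the starting level I want
   x.m = 0;   // clear the marginal too, since it influences the next solve
   solve m using nlp minimizing z;
);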

Hope it helps!

Claudio


Claudio,

Your frustration with how to start a solve from a consistent state is one I have also experienced in the past. Fortunately, this issue can be addressed with the GAMS savepoint and loadpoint functionality. These are covered in the documentation and have probably come up a few times on this list, so I won't repeat that here. However, a working example (building on circle from gamslib) is attached to illustrate. Let me know if it fails to work, is unclear, etc.
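
In case the attachment gets lost in the archives, the kind of pattern I mean looks roughly like this (a sketch only, not the attached circle-based example; all names here are made up):

$eolcom //

Set r / r1*r5 /;
Parameter shift(r);
shift(r) = ord(r) - 1;
Scalar a "current shift" / 0 /;

Variable x, z;
Equation defz;
defz.. z =e= sqr(x - a) + 10*cos(x);

Model m / all /;

* reference solve: savepoint = 1 writes its solution point to m_p.gdx
x.l = 0;
m.savepoint = 1;
solve m using nlp minimizing z;
m.savepoint = 0;   // stop overwriting the point file in later solves

loop(r,
   a = shift(r);
   execute_loadpoint 'm_p.gdx';   // restore the saved levels and marginals
   solve m using nlp minimizing z;
);

The point is that every solve in the loop starts from exactly the same saved state, regardless of what the previous solve returned.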

-Steve

Attachment: circle.gms (1.66 KB)

Claudio, thank you very much for your advice. It seems to be the only way to achieve this (at least for now).

