Structuring of Excel output-file in GAMS

Dear all,

I refer to this post: https://newforum.gams.com/t/help-with-gams-gdx-and-xlsx/2663/1

I have a problem with the structure of my Excel output file when the exported variables have many dimensions.
As recommended there, I added:

rdim=5 cdim=1

My code is shown in the attachment.
Parameter “qabg” is six-dimensional.
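
For context, the kind of gdxxrw call I mean looks roughly like this (the set, parameter and file names below are just placeholders, not my real ones):

* placeholder names: a 6-dimensional parameter with 5 dimensions in the rows and the years in the columns
set i /i1*i3/;  alias(i,j,k,l,m);
set t /2015*2020/;
parameter qabg_demo(i,j,k,l,m,t);
qabg_demo(i,j,k,l,m,t) = uniform(0,10);
execute_unload 'demo.gdx', qabg_demo;
execute 'gdxxrw.exe demo.gdx o=demo.xlsx par=qabg_demo rng=qabg!A1 rdim=5 cdim=1';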

I still get the same Excel structure as shown in that post.

The aim is to get an Excel file with the same structure as in the *.gdx file; mainly, all dimensions should be in columns.

I hope you understand my problem.

Irena
Capture.JPG

Hi Jarenka
If I understand you correctly, you want the parameter written to the Excel file with the 6 dimensions in the columns. For this, you write cdim=6 (and no rdim, or rdim=0).
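
With your names, the call would then look something like this (just a sketch, adjust the gdx file, output file and symbol names to yours):

execute 'gdxxrw.exe results.gdx o=results.xlsx par=qabg rng=qabg!A1 rdim=0 cdim=6';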
Cheers
Renger

Hi Renger,

This is something that I also tried, but I didn’t get any Excel output file at all.
See the attachment.
Capture.JPG

How many elements does qabg have? If it is over about 1 million rows, you can’t save it in Excel because of the maximum size of a worksheet (total number of rows and columns on a worksheet: 1,048,576 rows by 16,384 columns).
Cheers
Renger

Hi Renger,

It has 758,609 x 19 = 14,413,571 records (in the .gdx file). It seems that, no matter what, the gdx file is transformed into an Excel file with 758,609 rows and 24 columns (5 dimensions + 19 years).
Where did you find this number?

This is good information to know, because we are working with variables that have many dimensions. In this case, I have to find a better solution for exporting variables and presenting them.

My task is to extract variables for our users and present them in a user-friendly way.
Since the variables are in data-set form, the users cannot work with them directly. If they had the variables in an Excel file, they could play with a pivot table, some figures in the file, etc.

Do you know of an interface, connected to GAMS, that can easily extract variables (with many dimensions) and present them in figures, tables, etc.? The point is to create presentations or files that users can easily understand and work with.


Jarenka

Hi Jarenka

I think PowerPivot (an add-in for Excel 2010 and later) might be the solution. I haven’t worked with it, but I know it has no problem with over 150 million rows.
If importing the data from the gdx file remains a problem (even with PowerPivot), you could always either

  • write a csv file from your Gams program (see the put-facility sketch below), or
  • send the data to MySQL or Access and use the data import in Excel.
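
For the first option, a minimal put-facility sketch (with made-up set and parameter names) looks like this:

* made-up names: write a 3-dimensional parameter to a comma-delimited file with the put facility
set i /i1*i3/;
alias(i,j,k);
parameter p(i,j,k);
p(i,j,k) = uniform(0,10);

file fcsv /p.csv/;
fcsv.pc = 5;    * pc = 5: delimited output, fields separated by commas, labels quoted
put fcsv 'dim1','dim2','dim3','value' /;
loop((i,j,k)$p(i,j,k),
  put i.tl, j.tl, k.tl, p(i,j,k) /;
);
putclose fcsv;

The resulting file can then be read directly by Excel, Access or MySQL.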

Cheers
Renger
PS: The number of around 1 million rows can be found here

Thank you Renger,

Concerning *.csv files: does it work in GAMS Studio?

Hi Jarenka
Yes, just use gdxdump (it dumps a gdx file to text or csv).
Cheers
Renger

Hi again,

I have tried to create a *.csv file with the following command (trying different delimiter options):

$call gdxdump csv_output.gdx output=Employment_qaeg.csv delim=comma symb=qaegT format=csv

And I always get an incomplete csv file with an error - see the attachment.

Am I doing something wrong, or is there another way to extract a full multi-dimensional parameter from a gdx file?

Irena
error.JPG

Hi Jarenka

You show an unrelated figure (it only shows that the number of lines is too big). If you want help on the csv issue, check whether you still get the error when the parameter produces only a few lines in the csv.
Cheers
Renger

Hi Renger,

Yes, with only a few lines the CSV format works.
So if I understand correctly, the CSV format also has a limited number of rows. Right?

Jarenka

Not as far as I know. If I run the following Gams code, which generates 1,000,000 rows with 6 dimensions:

set t /1*10/;
alias(t,t2,t3,t4,t5,t6);
parameter test(t,t2,t3,t4,t5,t6);
test(t,t2,t3,t4,t5,t6) = uniform(0,10);
execute_unload 'results.gdx', test;

(be sure to use execute_unload, otherwise the parameter test will not be saved in the gdx file).
And then I write this out to a csv file using the following code:


parameter test;
$gdxin results.gdx
$load test
$call gdxdump results.gdx output=results.csv delim=comma symb=test format=csv

I get a csv file with 1,000,000 rows plus 1 row for the heading.

So, no limit.

Cheers
Renger

Hi Renger,

1,000,000 rows is not a problem. The problem with csv starts when I have over 1,048,576 rows.
Can you try to extend your test parameter to have over 1,048,576 rows?

Jarenka

Hi Jarenka
On my machine, I tried 15^6 (> 11 million lines) and it works fine. It produces a CSV file of about 450 MB.
Cheers
Renger

Hi Renger,

I tried your code first, and it seems I have a problem with the gdx file. See the attachment:

Jarenka
code.JPG

Hi Jarenka

I used two separate gams files. If you put it all in one file, you don’t need to load test (and if you did, you would have to use execute_load, as the gdx file is produced at execution time and is not available at compile time).
Just run:

set t /1*15/;
alias(t,t2,t3,t4,t5,t6);
parameter test(t,t2,t3,t4,t5,t6);

test(t,t2,t3,t4,t5,t6) = uniform(0,10);

execute_unload 'results.gdx', test;
$call gdxdump results.gdx output=test.csv delim=comma symb=test format=csv
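
If you did want to read the gdx back in within the same run, execute_load is the way to do it; a minimal sketch (made-up names) would be:

set a /a1*a3/;
alias(a,b);
parameter original(a,b), copy(a,b);
original(a,b) = uniform(0,10);
execute_unload 'small.gdx', original;
* execute_load runs at execution time, so the gdx written just above already exists
execute_load 'small.gdx', copy=original;
display copy;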

Cheers
Renger

Yes, I corrected the code and there are no errors.
But the problem appears when I want to look at the results.csv file. It shows only 1,048,576 rows (I increased the dimensions of the test parameter) and no more.
code1.JPG
code.JPG

Yes, that is the Excel limit. Now you should use PowerPivot to import the data into Excel. Then you can make a pivot table of your data (I hadn’t used this before, but trying PowerPivot, I imported the data and produced a pivot table in 2 minutes).
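
By the way, you can convince yourself that the csv itself is complete by counting its lines outside Excel, for example on Windows (assuming the file is called test.csv):

* prints the number of lines in test.csv to the log
$call find /c /v "" test.csv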
image.png
Cheers
Renger

Everything would be perfect if I had PowerPivot :-)

Office 365 comes with PowerPivot. If you don’t have a current version, you could always import the data into Access (or MySQL, which is free).
Also in earlier versions of Excel, you can connect to a database, run queries and import the results (so you don’t have to import all the data). Once you have imported your query/table, you can make graphs, tables, etc. The main point here is that you are still bound by the roughly 1 million row limit, but you will probably want to aggregate or filter the data anyway.
Cheers
Renger