Hi Jarenka
If I understand you correctly, you want to have the parameter written to the Excel file with the 6 dimensions in the columns. For this, you write cdim=6 (and no rdim, or rdim=0).
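For example (assuming your parameter is called qab and sits in results.gdx; adjust the names to your files):

execute 'gdxxrw results.gdx output=results.xlsx par=qab rng=data!A1 cdim=6 rdim=0';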
Cheers
Renger
How many elements does qab have? If it is over 1 million rows, you can't save it in Excel because of the maximum size of a sheet (the total number of rows and columns on a worksheet is 1,048,576 rows by 16,384 columns).
Cheers
Renger
It has 758,609 x 19 = 14,413,571 records (in the .gdx file). It seems that no matter what, it automatically transforms the gdx file into an Excel file with 758,609 rows and 24 columns (5 dimensions + 19 years).
Where did you find this number?
This is good information to know, because we are working with variables that have many dimensions. In this case, I have to find a better solution for exporting the variables and presenting them.
My task is to extract variables for our users, and present them in a user-friendly way.
Since the variables are in data set form, the users cannot work with them. If they had the variables in an Excel file, they could play with pivot tables or some figures in the file, etc.
Do you know an interface, connected to GAMS, that can easily extract some variables (with many dimensions) and present them in figures, tables, etc.? The point is to create a presentation or files which users can easily understand and work with.
I think PowerPivot (an add-in for Excel 2010 upwards) might be the solution. I haven't worked with it, but I know it has no problems with over 150 million rows.
If the problem of importing the data from the gdx remains (even with PowerPivot), you could always either
write a csv file from your Gams program (see the put sketch below), or
send the data to MySQL or Access and use Excel's data import.
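A minimal sketch of the put route (the sets and the dummy parameter are just placeholders; extend the loop to your real dimensions):

set i /i1*i3/, j /j1*j3/;
parameter dummy(i,j);
dummy(i,j) = uniform(0,10);
file fout /dummy.csv/;
* pc=5: comma-delimited output with quoted text
fout.pc = 5;
put fout;
loop((i,j), put i.tl, j.tl, dummy(i,j) /;);
putclose fout;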
Cheers
Renger
PS. The number of around 1 million rows can be found here
The figure you show is unrelated (it only shows that the number of lines is too big). If you want help with the csv issue, check whether you still get the error when the parameter has only a few lines in the csv.
Cheers
Renger
Not as far as I know. If I run the following Gams code, which generates 1,000,000 rows with 6 dimensions:
set t /1*10/;
alias(t,t2,t3,t4,t5,t6);
parameter test(t,t2,t3,t4,t5,t6);
test(t,t2,t3,t4,t5,t6) = uniform(0,10);
execute_unload 'results.gdx', test;
(be sure to use execute_unload, otherwise the parameter test will not be saved in the gdx file).
And then I write it to a csv file with the following gdxdump call (run from a second gams file):
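$call gdxdump results.gdx output=test.csv delim=comma symb=test format=csv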
1,000,000 rows is not a problem. The problem with csv starts when I have over 1,048,576 rows.
Can you try to extend your test-parameter to have over 1,048,576 rows?
I used two separate gams files. If you put it all in one file, you don't need to load test (and if you did, you should use execute_load, as the gdx file is produced at execution time and is not available at compile time).
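If you did reload it in one file, it would look like this (just a sketch; you normally don't need it, as test is still in memory):

execute_unload 'results.gdx', test;
* reload at execution time; a compile-time $gdxin/$load would not see the freshly written file
execute_load 'results.gdx', test;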
Just run:
set t /1*15/;
alias(t,t2,t3,t4,t5,t6);
parameter test(t,t2,t3,t4,t5,t6);
test(t,t2,t3,t4,t5,t6) = uniform(0,10);
execute_unload 'results.gdx', test;
execute 'gdxdump results.gdx output=test.csv delim=comma symb=test format=csv';
Yes. I corrected the code. There are no errors.
But the problem appears when I want to look at the results.csv file: it shows only 1,048,576 rows (I increased the dimensions of the test parameter) and no more.
Yes, that is the Excel limit. Now you should use PowerPivot to import the data into Excel. Then you will have the possibility to make a pivot table of your data (I hadn't used it before, but I just tried PowerPivot: I imported the data and produced a pivot table in 2 minutes).
Office 365 comes with PowerPivot. If you don't have a current version, you could always import the data into Access (or MySQL, which is free).
You can also, in earlier versions of Excel, connect to a database, run queries and import the results (so you don't have to import all the data). Once you have imported your query/table, you can make graphs, tables, etc. The main point here is that you are still bound by the roughly 1 million rows, but you will probably want to aggregate or filter the data anyway.
Cheers
Renger