Dear support team,
I am working on an assignment problem whose output assigns tasks to resources while minimizing the number of resources used. Suppose the solver produces the following output:
        m1      m2      m3      m4
item1           1.000
item2                           1.000
item3                   1.000
item4   1.000
item5           1.000
item6           1.000
item7                   1.000
item8                           1.000
item9                           1.000
item10  1.000
item11                  1.000
item12          1.000
where the indices m refer to the resources. Now, I would like to know how to present the results in a readable form like:
m1 : {item4, item10}
...
m4: {item2, item8, item9}
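For illustration, here is a rough Python sketch of the grouping I have in mind (the `assign` dictionary is hypothetical and just mirrors the table above; it is not how my GAMS model stores the variable):

```python
# Hypothetical assignment results: (item, machine) pairs with level 1.000,
# mirroring the output table above.
assign = {
    ("item1", "m2"): 1.0, ("item2", "m4"): 1.0, ("item3", "m3"): 1.0,
    ("item4", "m1"): 1.0, ("item5", "m2"): 1.0, ("item6", "m2"): 1.0,
    ("item7", "m3"): 1.0, ("item8", "m4"): 1.0, ("item9", "m4"): 1.0,
    ("item10", "m1"): 1.0, ("item11", "m3"): 1.0, ("item12", "m2"): 1.0,
}

# Group the items per machine.
machines = {}
for (item, machine), val in assign.items():
    if val > 0.5:  # treat near-1 levels as "assigned"
        machines.setdefault(machine, []).append(item)

# Print one readable line per machine, items ordered by their numeric index.
for m in sorted(machines):
    items = sorted(machines[m], key=lambda s: int(s[4:]))
    print(f"{m}: {{{', '.join(items)}}}")
```

This prints one line per machine in the desired `m1: {item4, item10}` style.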
I have tried to define a parameter sequence(m,j), and the output looks like the following:
      item1   item2   item3   item4   item5   item6   item7   item8   item9   item10  item11  item12
m1                            4.000                                           10.000
m2    1.000                           5.000   6.000                                                   12.000
m3                    3.000                           7.000                           11.000
m4            2.000                           8.000   9.000
I need to determine the sequence of jobs on each machine, i.e., a parameter/set from which I can calculate the start and finish times of each job on each machine. For certain reasons, I would prefer not to introduce decision variables and extra constraints to capture this.
As an example, in the sequence above, two jobs are processed on resource m1: item4 and item10. By convention, item4 should be processed before item10; this general rule (lower item index first) determines the order of jobs on each machine. For the table above:
m1: {1: item4, 2: item10}
...
m4: {1: item2, 2: item8, 3: item9}
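In Python terms, the positional numbering I am after could be built like this (the `machines` dictionary is just the table above written out by hand):

```python
# Items assigned to each machine, taken from the sequence table above.
machines = {
    "m1": ["item4", "item10"],
    "m2": ["item1", "item5", "item6", "item12"],
    "m3": ["item3", "item7", "item11"],
    "m4": ["item2", "item8", "item9"],
}

# Number the jobs on each machine by the rule "lower item index first".
order = {
    m: {pos: job
        for pos, job in enumerate(sorted(jobs, key=lambda s: int(s[4:])), start=1)}
    for m, jobs in machines.items()
}

print(order["m1"])  # {1: 'item4', 2: 'item10'}
print(order["m4"])  # {1: 'item2', 2: 'item8', 3: 'item9'}
```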
So the start time of item4 would be 0, and it would finish at 0 + duration(4). The start time of item10 would then be the finish time of item4, and its finish time would be its start time + duration(10). Unfortunately, I cannot find an appropriate way to express this. So I was wondering: is there any way to determine the sequence of jobs on each machine in a parametrized/algorithmic manner?
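To make the desired start/finish computation concrete, here is a small Python sketch of the chaining rule above (the duration values are made up purely for illustration):

```python
# Hypothetical per-item durations (made-up numbers, for illustration only).
durations = {"item4": 3.0, "item10": 2.5}

# Jobs on machine m1, already ordered by the rule "lower item index first".
sequence_m1 = ["item4", "item10"]

start, finish = {}, {}
t = 0.0
for job in sequence_m1:
    start[job] = t            # a job starts when its predecessor finishes
    t += durations[job]       # accumulate processing time
    finish[job] = t

print(start)   # item4 starts at 0.0, item10 at 3.0
print(finish)  # item4 finishes at 3.0, item10 at 5.5
```

This is exactly the recursion I would like to express inside GAMS without extra decision variables.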
I know that GAMS comes with an embedded Python facility (although I do not have much experience with it), and it would be worthwhile if it could help achieve what I want.
Best regards
Abbas