Extensions of the Basic Model
Chapter 6
Elements of Sequencing and Scheduling by Kenneth R. Baker
Byung-Hyun Ha
Outline
Introduction
Nonsimultaneous arrivals
• Minimizing the makespan
• Minimizing maximum tardiness
• Other measures of performance
Dependent jobs
• Minimizing maximum tardiness
• Minimizing total flowtime with strings
• Minimizing total flowtime with parallel chains
Sequence-dependent setup times
• Dynamic programming solutions
• Branch and bound solutions
• Heuristic solutions
Summary
Introduction
Basic single-machine model
Assumptions
C1. A set of n independent, single operation jobs is available simultaneously (at time zero).
C2. Setup times for the jobs are independent of job sequence and are included in processing times.
C3. Job descriptors are deterministic and known in advance.
C4. One machine is continuously available and never kept idle while work is waiting.
C5. Once an operation begins, it proceeds without interruption.
An opportunity to study a variety of scheduling criteria as well as a number of solution techniques
Possible generalizations (in this chapter)
• C1: nonsimultaneous job arrivals
• C1: dependent job sets
• C2: sequence-dependent setups
• C3: use of probabilistic methods
Nonsimultaneous Arrivals
Static version of the single-machine problem
All jobs are simultaneously available for processing
• e.g., 1 || ΣCj , 1 || ΣwjUj
Dynamic version Allowing different ready times (rj)
Examples of scheduling with ready times
• Basic model (C4 and C5 retained)
• Inserted idle time allowed -- 1 | rj | ΣTj
• Job preemption allowed (preempt-resume mode) -- 1 | rj , prmp | ΣTj
[Table of the three-job example data (rj, pj, dj) and the corresponding Gantt charts omitted; the illustrated optimal schedules under the three settings yield T1* = 3, T2* = 1, T3* = 0.]
Nonsimultaneous Arrivals
Scheduling with preemption allowed
Preempt-resume mode
• Schedules without inserted idle time constitute a dominant set for regular measures.
• Properties associated with transitive rules are often essentially unchanged.
• Dispatching as decision making is possible for optimality -- no look-ahead needed
• Example -- 1 | rj , prmp | Tmax
  » Keep the machine assigned to the available job with the earliest due date.
• Example -- 1 | rj , prmp | ΣCj
  » Keep the machine assigned to the available job with the minimum remaining processing time (SRPT: shortest remaining processing time).
Preempt-repeat mode
• A job must be restarted each time it is interrupted.
• Schedules without preemption (permutation schedules) constitute a dominant set.
• Inserted idle time is determined uniquely by the choice of permutation.
• Look-ahead information is needed (which makes solution approaches complex).
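The SRPT rule above can be sketched as a preempt-resume simulation. This is a minimal sketch, not the text's own code; the three-job instance is hypothetical, chosen only to show a preemption.

```python
# Preempt-resume SRPT dispatching for 1 | rj, prmp | sum Cj.
# The three-job instance at the bottom is hypothetical, for illustration only.
import heapq

def srpt(jobs):
    """jobs: list of (r_j, p_j) pairs; returns the completion times C_j."""
    order = sorted(range(len(jobs)), key=lambda j: jobs[j][0])  # by ready time
    C, t, i = [0] * len(jobs), 0, 0
    heap = []  # (remaining processing time, job index) of released, unfinished jobs
    while i < len(order) or heap:
        if not heap:                        # machine idle: jump to the next arrival
            t = max(t, jobs[order[i]][0])
        while i < len(order) and jobs[order[i]][0] <= t:
            j = order[i]
            heapq.heappush(heap, (jobs[j][1], j))
            i += 1
        rem, j = heapq.heappop(heap)        # job with shortest remaining time
        nxt = jobs[order[i]][0] if i < len(order) else float('inf')
        run = min(rem, nxt - t)             # run until completion or next arrival
        t, rem = t + run, rem - run
        if rem == 0:
            C[j] = t
        else:
            heapq.heappush(heap, (rem, j))  # preempted: resume later
    return C

print(srpt([(0, 5), (1, 1), (2, 2)]))   # [8, 2, 4] -> total flowtime 14
```

The same event loop with an earliest-due-date key gives the preemptive EDD rule for 1 | rj , prmp | Tmax.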
Nonsimultaneous Arrivals
Minimizing the makespan
1 | rj | Cmax
• Makespan -- denoted by M or Cmax
• Related to the throughput of the schedule
Cmax is constant in the basic model (1 || Cmax), regardless of sequence
M is minimized by the Earliest Ready Time (ERT) rule
• A nondelay dispatching procedure
• Yields blocks of jobs
Generalization of the problem
Each job has a delivery time, qj
• Delivery takes place immediately after the job completes, in parallel with further processing.
• Makespan includes delivery time.
• Symmetric property -- equivalent to the reversed problem (exchange rj and qj).
[Figure: three jobs followed by their tails (delivery times), which run in parallel with the machine; the makespan is realized as M = C2 + q2.]
Nonsimultaneous Arrivals
Generalization of the problem (cont'd)
Head-body-tail problem
• Job specified by the triple (rj, pj, qj)
• NP-hard -- equivalent to 1 | rj | Lmax (discussed later)
A good heuristic solution
• A nondelay dispatching procedure that always selects the available job with the largest tail qj
ALGORITHM 1 -- The Largest Tail (LT) Procedure
1. Initially, let t = 0.
2. If no unscheduled jobs are available at time t, set t equal to the minimum ready time among unscheduled jobs; otherwise, proceed.
3. Find job j with the largest qj among unscheduled jobs available at time t. Schedule j to begin at time t.
4. Increase t by pj . If all n jobs are scheduled, stop; otherwise return to Step 2.
Exercise -- head-body-tail problem with 5 jobs• Algorithm 1 should be executed twice (why?)
Job j  1  2  3  4  5
rj     0  2  3  0  6
pj     2  1  2  3  2
qj     5  2  6  3  1
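Algorithm 1 can be sketched as follows and run on the exercise data above. Running it a second time on the reversed problem (rj and qj exchanged) exploits the symmetry of the head-body-tail problem, which is why two executions are worthwhile.

```python
# ALGORITHM 1 (Largest Tail procedure) on the 5-job exercise data.
# A second pass on the reversed problem (r and q exchanged) uses the
# symmetric property of the head-body-tail problem.
def largest_tail(r, p, q):
    """Nondelay dispatch of the available job with the largest tail.
    Returns (sequence, makespan), with makespan = max_j (C_j + q_j)."""
    unscheduled, t, seq, M = set(range(len(p))), 0, [], 0
    while unscheduled:
        avail = [j for j in unscheduled if r[j] <= t]
        if not avail:                        # Step 2: jump to the next ready time
            t = min(r[j] for j in unscheduled)
            continue
        j = max(avail, key=lambda j: q[j])   # Step 3: largest tail first
        t += p[j]                            # Step 4: job j completes at t
        M = max(M, t + q[j])
        seq.append(j + 1)                    # report jobs 1-based, as on the slide
        unscheduled.remove(j)
    return seq, M

r, p, q = [0, 2, 3, 0, 6], [2, 1, 2, 3, 2], [5, 2, 6, 3, 1]
fwd = largest_tail(r, p, q)
rev = largest_tail(q, p, r)
print(fwd)   # ([1, 4, 3, 2, 5], 13)
print(rev)   # ([5, 2, 4, 3, 1], 12) -- the reversed pass finds the better makespan
```

Reading the reversed schedule backwards gives the sequence 1-3-4-2-5 for the original problem, also with makespan 12.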
Nonsimultaneous Arrivals
Generalization of the problem (cont'd)
Optimality condition
• Makespan M = ri + Σj=i..k pj + qk ,
• for some job i that initiates a block and for some job k in the block, called the critical job (jobs are renumbered according to sequence)
• If qk ≥ qj for all jobs j from i to k, then M is optimal. (Theorem 1, discussed later)
This is a sufficient condition: if it does not hold, the schedule may or may not be optimal.
[Figure: a block of jobs beginning with job i and containing the critical job k; the makespan is realized as M = Ck + qk, the completion of the critical job plus its tail.]
Nonsimultaneous Arrivals
Minimizing maximum tardiness
1 | rj | Lmax
• Strongly NP-hard (p. 44 of Pinedo, 2008)
• 3-PARTITION reduces to 1 | rj | Lmax
EDD solves 1 || Lmax
Equivalence to the head-body-tail problem
• Let qj = D − dj , where D = max{dj}
• min Lmax = max{Cj − dj} = max{Cj − (D − qj)} = max{Cj + qj} − D
Theorem 1
• In the dynamic Lmax-problem, a nondelay implementation of the EDD rule yields
  Lmax = ri + Σj=i..k pj − dk
  for some job i that initiates a block, and for some job k in the same block, where the jobs are numbered in order of appearance in the schedule. If dk ≤ dj for all jobs j from i to k, then Lmax is optimal.
Proof of Theorem 1
• Relaxation: consider only the jobs from i to k, with all ready times set to ri
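The nondelay EDD rule of Theorem 1 can be sketched as a dispatcher; the three-job instance below is hypothetical, chosen so that the qk-condition of Theorem 1 happens to certify optimality.

```python
# Nondelay EDD dispatch for 1 | rj | Lmax (Theorem 1).
# The three-job instance below is hypothetical, for illustration only.
def edd_nondelay(r, p, d):
    """Return Lmax of the nondelay EDD schedule."""
    unscheduled, t, L = set(range(len(p))), 0, float('-inf')
    while unscheduled:
        avail = [j for j in unscheduled if r[j] <= t]
        if not avail:                        # no job ready: jump to next ready time
            t = min(r[j] for j in unscheduled)
            continue
        j = min(avail, key=lambda j: d[j])   # earliest due date among available jobs
        t += p[j]                            # job j completes at t
        L = max(L, t - d[j])
        unscheduled.remove(j)
    return L

r, p, d = [0, 1, 3], [4, 2, 1], [8, 5, 6]
print(edd_nondelay(r, p, d))   # 1
# The single block starts with job 1 (ready at 0); the critical job is job 2,
# and d2 = 5 <= d1 = 8, so Theorem 1 certifies that Lmax = 1 is optimal.
```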
Nonsimultaneous Arrivals
Other measures of performance
Mostly NP-hard, unless in preempt-resume mode
• e.g., 1 | rj | Lmax , 1 | rj | ΣCj , 1 | rj | ΣUj (then, how about 1 | rj | ΣTj ?)
• The preempt-resume version of a problem provides a lower bound for branch and bound.
• Less clear in the case of 1 | rj | ΣUj or 1 | rj | ΣTj
1 | rj | Lmax
• Theorem 2
  • In the dynamic Lmax-problem, suppose that the nondelay implementation of EDD yields a sequence of the jobs in EDD order. Then this nondelay schedule is optimal.
• Proof of Theorem 2
  • Relaxation with all ready times set to zero
• Theorem 3
  • In the dynamic Lmax-problem, if the ready times and due dates are agreeable, then the nondelay implementation of EDD is optimal.
Nonsimultaneous Arrivals
Other measures of performance (cont'd)
1 | rj | ΣCj
• Theorem 4
  • In the dynamic F-problem, if the ready times and processing times are agreeable, then the nondelay implementation of SPT is optimal.
• Some heuristics for the general case
  • Nondelay adaptation of SPT
  • First Off First On (FOFO) rule
    » Exploits look-ahead information (so it is not a dispatching rule)
  • Priority to the job with the smallest sum of earliest start time (rj) and earliest finish time (rj + pj), i.e., smallest (2rj + pj)
1 | rj | ΣTj
• Theorem 5
  • In the dynamic T-problem, if the ready times, processing times, and due dates are all agreeable, the nondelay implementation of MDD is optimal.
1 | rj | ΣUj
• ALGORITHM 2 -- Minimizing ΣU (Dynamic Version)
• Optimal in the case of agreeable ready times and due dates
Dependent Jobs
Constraints in scheduling
Machine capacity (in the basic model)
+ Technical restrictions -- specified by the admissible sequences of pairs of jobs
• Reduce the set of feasible solutions
Dominance between jobs
Precedence constraints, i → j
• Job j is not permitted to begin until job i is complete.
• Job i is a predecessor of job j; job j is a successor of job i.
Direct predecessor, direct successor
Example -- 1 | prec | ΣCj
• Three jobs a, b, c with pa ≤ pb ≤ pc
• Optimal without precedence: a-b-c
• With the additional precedence c → a
  • Clearly, c-b-a is not optimal (why?)
  • Then, c-a-b or b-c-a?
Dependent Jobs
Minimizing maximum tardiness
1 | rj , prec | Lmax -- NP-hard (why?)
• Apply any optimization approach for 1 | rj | Lmax , after the following revision
• Dominance property
  » Job j follows job i in an optimal sequence if ri ≤ rj and di ≤ dj .
• For each precedence i → j, revise rj and di to rj' and di' such that
  » rj' = max{rj , ri + pi} and di' = min{di , dj − pj}
  » i.e., make the ready times and due dates agreeable, consistent with the precedence
It is not necessary to design a new algorithm for this special case.
• Justification of the revision -- when di > dj − pj
  • Li = Ci − di' = Ci − (dj − pj) ≤ (Cj − pj) − (dj − pj) = Cj − dj = Lj
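The revision step can be sketched as a fixed-point propagation over the precedence pairs; the three-job instance and the (i, j) pair encoding are hypothetical.

```python
# Ready-time / due-date revision for 1 | rj, prec | Lmax:
# propagate rj' = max{rj, ri + pi} and di' = min{di, dj - pj} until stable,
# so that chains of precedence constraints are fully accounted for.
# The three-job instance below is hypothetical, for illustration only.
def revise(r, p, d, prec):
    """prec: list of (i, j) pairs meaning i -> j (0-based); assumes no cycles."""
    r, d = list(r), list(d)
    changed = True
    while changed:
        changed = False
        for i, j in prec:
            if r[j] < r[i] + p[i]:          # j cannot start before i finishes
                r[j] = r[i] + p[i]; changed = True
            if d[i] > d[j] - p[j]:          # i must leave room for j before dj
                d[i] = d[j] - p[j]; changed = True
    return r, d

r, d = revise([0, 0, 1], [3, 2, 4], [10, 6, 9], [(0, 1), (1, 2)])
print(r, d)   # [0, 3, 5] [3, 5, 9]
```

After the revision, the jobs can be treated as independent and any method for 1 | rj | Lmax applied directly.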
1 | prec | Lmax
• Revise only the due dates, and apply EDD
1 | prec | max gj(Cj) -- extension of Theorem 1 of Ch. 3
• When the objective is to minimize the maximum penalty, job i may be assigned the last position in sequence if job i has no unscheduled successors and gi(P) ≤ gj(P) for all jobs j ≠ i.
Dependent Jobs
1 | prec | ΣCj
Strongly NP-hard for an arbitrary precedence structure
Some special cases admit polynomial-time algorithms
• Precedence structures with strings and chains
Minimizing total flowtime with strings
String
• A set of jobs that must appear together (contiguously) and in a fixed order
• e.g., 4 jobs with the string (1-2-3)
  • Only two possible sequences: 1-2-3-4 or 4-1-2-3
Some applications
• Conflict between sorting and precedence constraints
  • e.g., a single relevant precedence constraint i → j but pj < pi
    » j is preferred to i for the F criterion
    » There exists an optimal sequence in which jobs i and j are adjacent, in that order (why?)
• Contiguity constraints, e.g., a group of jobs with a common major setup
• Chains and series-parallel networks (discussed next)
Dependent Jobs
Minimizing total flowtime with strings (cont'd)
Problem with s strings
• nk -- number of jobs in string k (1 ≤ k ≤ s)
• pkj -- processing time of job j in string k (1 ≤ j ≤ nk)
Let
• pk = Σj=1..nk pkj -- total processing time of string k
• F(k, j) -- flowtime of job j in string k
• F(k) = F(k, nk) -- flowtime of string k
Objective -- minimize the total flowtime (of jobs)
• min F = Σk=1..s Σj=1..nk F(k, j)
Theorem 6
• In the single-machine problem with job strings, total flowtime is minimized by sequencing the strings in the order
  p[1]/n[1] ≤ p[2]/n[2] ≤ ... ≤ p[s]/n[s]
Proof of Theorem 6
• F = Σk=1..s Σj=1..nk F(k, j) = Σk=1..s Σj=1..nk (F(k) − Σi=j+1..nk pki)
  = Σk=1..s nk F(k) − Σk=1..s Σj=1..nk Σi=j+1..nk pki = Σk=1..s nk F(k) − c,
  where c is a constant independent of the string sequence; an adjacent pairwise interchange argument on Σ nk F(k) then gives the p/n ordering.
Dependent Jobs
Minimizing total flowtime with parallel chains
Chain
• A precedence structure in which each job has at most one direct predecessor and at most one direct successor
• The jobs in a chain do not necessarily have to be sequenced contiguously
  (the jobs in a string do)
• Example with 9 jobs
  • Feasible sequences: 4-1-2-3-7-8-9-5-6, 7-1-4-2-5-6-3-8-9, ...
ALGORITHM 3 -- Parallel Chain Algorithm for F
1. Initially, each job is a string.
2. Find a pair of strings, u and v, such that u directly precedes v and pv /nv ≤ pu /nu . Replace the pair by the string (u, v). Then repeat this step. When no such pair can be found, proceed to Step 3.
3. Sort the strings in nondecreasing order of p/n. This gives an optimal schedule.
Justification of Algorithm 3
• Extended from Theorem 6 and the related analysis

Job j  1   2  3  4  5  6  7  8  9
pj     10  4  6  5  7  1  8  4  7
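Algorithm 3 can be sketched as below and run on the 9-job example. The chain structure 1→2→3, 4→5→6, 7→8→9 is inferred from the feasible sequences listed above, not stated explicitly on the slide.

```python
# ALGORITHM 3 (Parallel Chain Algorithm) for minimizing total flowtime.
# Chains 1->2->3, 4->5->6, 7->8->9 are inferred from the feasible sequences
# shown on the slide; processing times are those of the 9-job example.
def parallel_chains(chains, p):
    strings = [[j] for chain in chains for j in chain]   # Step 1: one job per string
    succ = {c[k]: c[k + 1] for c in chains for k in range(len(c) - 1)}
    ratio = lambda s: sum(p[j] for j in s) / len(s)
    merged = True
    while merged:                                        # Step 2: merge string u with
        merged = False                                   # its direct successor v
        for u in strings:                                # whenever pv/nv <= pu/nu
            v = next((s for s in strings
                      if s is not u and s[0] == succ.get(u[-1])), None)
            if v is not None and ratio(v) <= ratio(u):
                u.extend(v); strings.remove(v); merged = True
                break
    strings.sort(key=ratio)                              # Step 3: sort by p/n
    return [j for s in strings for j in s]

p = dict(zip(range(1, 10), [10, 4, 6, 5, 7, 1, 8, 4, 7]))
seq = parallel_chains([[1, 2, 3], [4, 5, 6], [7, 8, 9]], p)
t, total = 0, 0
for j in seq:
    t += p[j]; total += t                                # accumulate completion times
print(seq, total)   # [4, 5, 6, 7, 8, 1, 2, 3, 9] 247
```

Note that chain 7-8-9 ends up split: (7, 8) merges (ratio 6) but job 9 (ratio 7) stays separate and is sequenced last.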
Dependent Jobs
Minimizing total flowtime with parallel chains (cont'd)
Series-parallel precedence structure
• A network N consisting of a single node, or one that can be partitioned into two subnetworks N1 and N2 which are themselves series-parallel and where either:
  • N1 is in series with N2 (if i ∈ N1 and j ∈ N2 , then i → j), or
  • N1 is in parallel with N2 (if i ∈ N1 and j ∈ N2 , then neither i → j nor j → i)
• Example with 8 jobs
• Optimal sequence construction -- recursively apply the following from the leaves of the decomposition tree:
  • Series-type N: form the string (N1, N2)
  • Parallel-type N: apply Algorithm 3
[Figure: an 8-job series-parallel precedence structure with processing times, and its decomposition tree with S (series) and P (parallel) nodes.]
Sequence-dependent Setup Times
1 | sjk | Cmax
Setup times that cannot be absorbed into a job's processing time
Examples
• Production of different chemical compounds, colors of paint, strengths of detergent, blends of fuel (with cleansing required for switching)
• Process line for four types of gasoline
  • Setup times -- matrix sij (below)
  • Makespan
    • 1-2-3-4-1: p1 + 30 + p2 + 20 + p3 + 60 + p4 + 20 = Σj pj + 30 + 20 + 60 + 20
    • 1-2-4-3-1: p1 + 30 + p2 + 80 + p3 + 10 + p4 + 30 = Σj pj + 30 + 80 + 10 + 30
    • ...
Objective -- minimizing the makespan is equivalent to minimizing total setup time
• min M = F[n] + s[n],[n+1] = Σj=1..n+1 s[j−1],[j] + Σj=1..n pj  ⇔  min Σj=1..n+1 s[j−1],[j]

Setup times sij:
              (1)  (2)  (3)  (4)
Racing (1)     −   30   50   90
Premium (2)   40    −   20   80
Regular (3)   30   30    −   60
Unleaded (4)  20   15   10    −
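Since total processing time is constant, only the setups around the cycle matter. The comparison above can be checked directly:

```python
# Total setup time around a cyclic production sequence, using the gasoline
# setup-time matrix above (s[i][j] = setup from product i to product j).
s = {1: {2: 30, 3: 50, 4: 90},
     2: {1: 40, 3: 20, 4: 80},
     3: {1: 30, 2: 30, 4: 60},
     4: {1: 20, 2: 15, 3: 10}}

def total_setup(cycle):
    """Sum of setups along the cycle, returning to the first product at the end."""
    return sum(s[i][j] for i, j in zip(cycle, cycle[1:] + cycle[:1]))

print(total_setup([1, 2, 3, 4]))   # 130 = 30 + 20 + 60 + 20
print(total_setup([1, 2, 4, 3]))   # 150 = 30 + 80 + 10 + 30
```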
Sequence-dependent Setup Times
1 | sjk | Cmax (cont'd)
Strongly NP-hard
• The traveling salesman problem (TSP) reduces to 1 | sjk | Cmax .
TSP -- mathematical programming model
• Decision variables
  • xij = 1 if path (i, j) is part of the tour; xij = 0 otherwise
• Objective
  • z = Σi Σj(≠i) sij xij
• Constraints
  • the xij's must form a tour
Representation of a solution by a selection of paths
• Cost of a solution -- length of the tour
  • Sum of the lengths of the selected paths
• Example
  • x12 = x24 = x43 = x31 = 1; all others 0
[Graph representation of the setup-time matrix omitted.]

setup times sij
    1   2   3   4
1   −  30  50  90
2  40   −  20  80
3  30  30   −  60
4  20  15  10   −

Tour 1-2-4-3 selects paths (1,2), (2,4), (4,3), (3,1); length: 30 + 80 + 10 + 30 = 150.
Sequence-dependent Setup Times
Dynamic programming solutions
Let
• n -- the number of cities
• X -- the set of all cities
• J -- a subset of X
• i -- an arbitrarily chosen origin of the tour
A representation of the optimal tour
• Sequence of sets {i}, S, {k}, J, {i}
  • where i ≠ k, S ∩ J = ∅, |S| + |J| = n − 2, {i, k} ∪ S ∪ J = X
  • The tour begins at city i, proceeds through the cities in S, visits city k, then proceeds through the cities in J, and finally returns to i.
Formulation
• f(k, J) = minj∈J {skj + f(j, J − {j})}
  • the length of the shortest path from city k that passes through the cities in J and finishes at city i
• f(k, ∅) = ski -- base case
• f(i, X − {i}) -- the length of the optimal tour
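The recursion can be implemented directly with subsets encoded as bitmasks (the Held-Karp scheme); applied to the 4-product gasoline matrix it recovers the makespan-minimizing cycle.

```python
# Dynamic program following the recursion above:
#   f(k, J) = min_{j in J} { s_kj + f(j, J - {j}) },   f(k, {}) = s_ki.
# Subsets J are encoded as bitmasks; s is the 4-product gasoline matrix
# (cities 0-3 correspond to products 1-4).
from functools import lru_cache

s = [[0, 30, 50, 90],
     [40, 0, 20, 80],
     [30, 30, 0, 60],
     [20, 15, 10, 0]]
i0 = 0  # arbitrarily chosen origin city

@lru_cache(maxsize=None)
def f(k, J):
    if J == 0:                       # base case: return directly to the origin
        return s[k][i0]
    return min(s[k][j] + f(j, J & ~(1 << j))
               for j in range(len(s)) if J & (1 << j))

all_but_origin = (1 << len(s)) - 1 - (1 << i0)
print(f(i0, all_but_origin))   # 130, achieved by the tour 1-2-3-4-1
```

The table has O(n 2^n) entries, so this is practical only for small n, which motivates the branch and bound and heuristic approaches that follow.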
Sequence-dependent Setup Times
Branch and bound solutions
Branching scheme
• Create two subproblems at each level
  • one constraining a specific path to be part of the solution
  • the other prohibiting that same path
• e.g., a partition into solutions with (2,1) and solutions without (2,1), ...
[Figure: branching tree for a 4-city example. The root P is partitioned into solutions containing path (2,1) and solutions prohibiting it (*21); these are partitioned further, e.g., on path (3,4) under node 21 and on path (2,3) under node *21. Each node carries the setup-time matrix with the corresponding entries fixed or forbidden.]
Sequence-dependent Setup Times
Branch and bound solutions (cont'd)
Reduction of the sij matrix
• Subtract the minimum element of each row from that row
• Subtract the minimum element of each column from that column
Lower bound
• The sum of the subtraction constants used in the reduction
• A better bound can be obtained by solving the (relaxed) assignment problem
Example
• Root node (original problem) reduction: LB = 20
  • z = Σi Σj(≠i) sij xij = Σi Σj(≠i) s'ij xij + 4 + 5 + 4 + 2 + 5 = Σi Σj(≠i) s'ij xij + 20
• Selection of the paths x12 = x21 = x35 = x43 = x54 = 1?
  • Σi Σj(≠i) sij xij = Σi Σj(≠i) s'ij xij + 20 = 20
  » Optimal? No! It contains subtours!
P -- sij
    1   2   3   4   5
1   −   4   8   6   8
2   5   −   7  11  13
3  11   6   −   8   4
4   5   7   2   −   2
5  10   9   7   5   −

P (reduced) -- s'ij
    1   2   3   4   5
1   −   0   4   2   4
2   0   −   2   6   8
3   7   2   −   4   0
4   3   5   0   −   0
5   5   4   2   0   −

Assignment solution marked on P (reduced): (1,2), (2,1), (3,5), (4,3), (5,4) -- cost 20, but with subtours 1-2-1 and 3-5-4-3.
Sequence-dependent Setup Times
Branch and bound solutions (cont'd)
Justification of the reduction and lower bound
• Exactly one element of each row is contained in any solution
• Exactly one element of each column is contained in any solution
• The lower bound is distance that is unavoidable in any solution
Example of reduction
• Original and reduced lengths (LB = 120)
• Analysis from the perspective of node 4
original                 reduced (LB = 120)
 −  30  50  90            −   0  20  30
40   −  20  80           20   −   0  30
30  30   −  60            0   0   −   0
20  15  10   −           10   5   0   −

row reduction constants: 30, 20, 30, 10 (total 90)
column reduction constants: 0, 0, 0, 30 (total 30)
[Figures: graph views from node 4 of the original distances, the distances after row reduction by 30 + 20 + 30 + 10, and the distances after column reduction by 0 + 0 + 0 + 30.]
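The reduction can be sketched as follows; on the gasoline matrix it reproduces the bound LB = 120 derived above.

```python
# Row/column reduction of a setup-time matrix: the sum of the subtraction
# constants is a lower bound on the length of any tour.  Applied to the
# gasoline matrix, it reproduces LB = 120 (rows 30+20+30+10, column 30).
def reduce_matrix(s):
    """s: dict-of-dicts with entries s[i][j] for i != j; returns (reduced, LB)."""
    s = {i: dict(row) for i, row in s.items()}         # work on a copy
    lb = 0
    for row in s.values():                             # row reduction
        m = min(row.values())
        lb += m
        for j in row:
            row[j] -= m
    for j in {j for row in s.values() for j in row}:   # column reduction
        m = min(row[j] for row in s.values() if j in row)
        lb += m
        for row in s.values():
            if j in row:
                row[j] -= m
    return s, lb

s = {1: {2: 30, 3: 50, 4: 90},
     2: {1: 40, 3: 20, 4: 80},
     3: {1: 30, 2: 30, 4: 60},
     4: {1: 20, 2: 15, 3: 10}}
reduced, lb = reduce_matrix(s)
print(lb)              # 120
print(reduced[3][4])   # 0  (60 - 30 from its row - 30 from its column)
```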
Sequence-dependent Setup Times
Branch and bound solutions (cont'd)
Branching scheme (cont'd)
• Select a zero element of the reduced matrix to define the two subproblems (why?)
• A possible element-selection criterion
  • the zero that would permit the largest further reduction when prohibited
Example (the 5-city problem P above, reduced by 20, so LB = 20)
• Prohibiting a zero element (i, j) permits a further reduction equal to the smallest remaining element of row i plus the smallest remaining element of column j:
  (1,2): 4, (2,1): 5, (3,5): 2, (4,3): 2, (4,5): 0, (5,4): 4
• Branch on (2,1), the zero with the largest such value:
  » P(*2,1) -- path (2,1) prohibited: further reduced by 5, so LB = 25
  » P(2,1) -- path (2,1) included: further reduced by 4, so LB = 24
[Subproblem matrices omitted.]
Sequence-dependent Setup Times
Heuristic solutions
Simple greedy procedures
• Closest unvisited city
• Variations
  • closest unvisited city based on the reduced matrix (relative distance)
  • closest unvisited pair of cities (using look-ahead)
  • applying the procedure with every city as the origin, and keeping the best tour
Insertion procedure
1. Select two cities arbitrarily and form a partial tour.
2. Take a remaining city, try inserting it at every possible position of the current partial tour, and keep the best insertion.
3. Repeat Step 2 until a complete tour is constructed.
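The closest-unvisited-city procedure can be sketched as below, run on the 5-city matrix of the branch and bound example (0-based city indices).

```python
# Closest-unvisited-city (greedy) heuristic on the 5-city matrix used in the
# branch-and-bound example above (cities 0-4 correspond to cities 1-5).
s = [[0, 4, 8, 6, 8],
     [5, 0, 7, 11, 13],
     [11, 6, 0, 8, 4],
     [5, 7, 2, 0, 2],
     [10, 9, 7, 5, 0]]

def nearest_neighbour(s, start=0):
    tour, length = [start], 0
    unvisited = set(range(len(s))) - {start}
    while unvisited:
        nxt = min(unvisited, key=lambda j: s[tour[-1]][j])  # closest unvisited city
        length += s[tour[-1]][nxt]
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour, length + s[tour[-1]][start]                # close the tour

tour, length = nearest_neighbour(s)
print(tour, length)   # [0, 1, 2, 4, 3] 25
```

The "every city as origin" variation simply calls `nearest_neighbour` once per starting city and keeps the shortest tour found.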
General search methods
A huge variety of methods is available.
Summary
Generalization of the basic single-machine model
• More applicability, and new difficulties
Dynamic models
• Job preemption (preempt-resume, preempt-repeat)
• Inserted idle times
• Look-ahead procedures
Precedence constraints
• Strings and chains
• Series-parallel precedence
Sequence-dependent setup times
• Traveling salesman problem
END OF SINGLE MACHINE!! AT LAST!!