Dantzig-Wolfe Decomposition
Problem description
Removing a constraint from a program can dramatically simplify the solving process.
Assume the program has the form below, where the linking constraint \(\sum_j A_j x_j = b\) is the complicated one:
\[\begin{aligned} \min\quad & \sum_{j=1}^{n} c_j x_j \\ \text{s.t.}\quad & \sum_{j=1}^{n} A_j x_j = b \\ & B_j x_j = b_j, \quad j=1,\dots,n \\ & x_j \geq 0, \quad j=1,\dots,n \end{aligned}\]
Here \(x_j \in \mathbb{R}^{n_j}\), \(A_j\) has \(m\) rows, and \(B_j\) has \(m_j\) rows, so initially there are \(\sum_{j}n_j\) variables and \(m+\sum_jm_j\) constraints in total.
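For illustration (this picture is not in the notes, but follows directly from the formulation), the constraint matrix has a block-angular shape; dropping the first row of linking constraints would split the program into \(n\) independent, easy LPs:
\[\begin{pmatrix} A_1 & A_2 & \cdots & A_n \\ B_1 & & & \\ & B_2 & & \\ & & \ddots & \\ & & & B_n \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} = \begin{pmatrix} b \\ b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix}\]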
Dantzig-Wolfe Reformulation
Replace each \(x_j\) by a convex combination of the extreme points of the feasible region of \(B_jx_j=b_j,\ x_j\geq 0\).
Formally, let \(\mathcal S_j=\{x_j \mid x_j\geq 0,\ B_jx_j=b_j\}\) be the set of feasible solutions for block \(j\), which is a polyhedron (assumed bounded here, so it is fully described by its extreme points), and let \(\{x_j^1,\dots,x_j^{K_j}\}\) denote the extreme points of \(\mathcal S_j\). Then we can express any point in \(\mathcal S_j\) as
\[x_j = \sum_{k=1}^{K_j} x_j^k u_j^k,\]
where \(u_j^k\geq 0\) and \(\sum_{k=1}^{K_j}u_j^k=1\).
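As a quick illustration (not from the notes): \(\mathcal S=\{x\geq 0 : x_1+x_2=1\}\) has extreme points \((1,0)\) and \((0,1)\), and the feasible point \((0.3,0.7)\) is expressed as \(0.3\,(1,0)+0.7\,(0,1)\).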
The new program, called the master problem (MP), would be:
\[\min \sum_{j=1}^{n}\sum_{k=1}^{K_j}\left(c_j x_j^k\right)u_j^k\]
with constraints:
\[\begin{aligned} \sum_{j=1}^{n} A_j\left(\sum_{k=1}^{K_j} x_j^k u_j^k\right) &= b \\ \sum_{k=1}^{K_j} u_j^k &= 1, \quad j=1,\dots,n \\ u_j^k &\geq 0 \end{aligned}\]
The new program has \(\sum_j K_j\) variables (which can be very large) and \(m+n\) constraints.
Observe that \(\sum_{k=1}^{K_j}x_j^ku_j^k\) represents a point in \(\mathcal S_j\) and, as such, it automatically satisfies the constraint \(B_{j}\left(\sum_{k=1}^{K_{j}} x_{j}^{k} u_{j}^{k}\right)=b_{j}\), which is why the block constraints no longer appear explicitly in the MP.
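To make the reformulation concrete, here is a minimal numeric sketch (a hypothetical toy instance, not from the notes) with two box-shaped blocks \(\mathcal S_j=\{0\le x_j\le ub_j\}\), whose extreme points are the box corners. It enumerates the extreme points, builds the MP over the weights \(u_j^k\), and checks that the MP optimum matches the original LP; it assumes SciPy's HiGHS backend:

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy data: min sum_j c_j x_j  s.t.  sum_j A_j x_j = b,
# x_j in S_j = {0 <= x_j <= ub_j} (box blocks, extreme points = corners).
c = [np.array([1.0, 2.0]), np.array([3.0, 1.0])]
A = [np.array([[1.0, 1.0]]), np.array([[1.0, 2.0]])]
b = np.array([4.0])
ub = [np.array([2.0, 2.0]), np.array([2.0, 2.0])]
n = len(c)

# Enumerate the extreme points of each box: every corner (0 or ub_j[i]).
ext = [[np.array(p) for p in itertools.product(*[(0.0, ui) for ui in uj])]
       for uj in ub]

# Build the master problem over the weights u_j^k: each column carries
# the linking contribution A_j x_j^k plus the convexity unit vector e_j.
obj = [c[j] @ xk for j in range(n) for xk in ext[j]]
cols = [np.concatenate([A[j] @ xk, np.eye(n)[j]])
        for j in range(n) for xk in ext[j]]
A_eq = np.array(cols).T
b_eq = np.concatenate([b, np.ones(n)])
mp = linprog(obj, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")

# Solve the original LP directly for comparison.
lp = linprog(np.concatenate(c),
             A_eq=np.hstack(A), b_eq=b,
             bounds=list(zip(np.zeros(4), np.concatenate(ub))),
             method="highs")
print(mp.fun, lp.fun)   # the two optimal values should coincide
```

With only \(K_j=4\) corners per block this full enumeration is feasible; in general \(K_j\) explodes, which is exactly why the decomposition below generates columns on demand.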
Dantzig-Wolfe Decomposition
The idea is to solve the MP using the Simplex method without having all the data (columns) available at once.
Recap of Simplex
Consider the LP:
\[\min\ c^{\top}x \quad \text{s.t.}\quad Ax=b,\ x\geq 0.\]
We can partition the variables into basic and non-basic parts as:
\[Ax = Bx_B + Nx_N = b.\]
Then we can represent the basic variables using only the non-basic variables, which is:
\[x_B = B^{-1}b - B^{-1}Nx_N,\]
and the objective function now turns into:
\[c^{\top}x = c_B^{\top}x_B + c_N^{\top}x_N = c_B^{\top}B^{-1}b + \left(c_N^{\top} - c_B^{\top}B^{-1}N\right)x_N.\]
We refer to the coefficient of \(x_N\), \(c_N^{\top} - c_B^{\top}B^{-1}N\), as the reduced cost coefficients (RCC). If a component of the RCC is negative, increasing the corresponding component of \(x_N\) can only decrease the objective; that is, we would want to bring that non-basic variable into the basis to replace one of the current basic variables. If every RCC is greater than or equal to 0, Simplex terminates.
Consider the dual of the program:
\[\max_{\pi}\ \pi b \quad \text{s.t.}\quad \pi A \leq c^{\top}.\]
By strong duality, the optimal dual solution at an optimal basis satisfies:
\[\pi^{*} = c_{B}^{\top} B^{-1}.\]
Hence the RCC can be written as \(c_{N}^{\top}-c_{B}^{\top} B^{-1} N=c_{N}^{\top}-\pi^{*} N\).
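As a tiny numpy illustration (the LP data and basis choice below are made up for the example), the multipliers \(\pi^{*} = c_B^{\top}B^{-1}\) and the RCC can be computed directly:

```python
import numpy as np

# A made-up standard-form LP: min c^T x  s.t.  Ax = b, x >= 0,
# where columns 2 and 3 are slacks and form the current basis.
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 2.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
c = np.array([-1.0, -2.0, 0.0, 0.0])

B_idx, N_idx = [2, 3], [0, 1]
B, N = A[:, B_idx], A[:, N_idx]
cB, cN = c[B_idx], c[N_idx]

pi = cB @ np.linalg.inv(B)   # simplex multipliers  pi* = c_B^T B^{-1}
rcc = cN - pi @ N            # reduced cost coefficients c_N^T - pi* N
print(rcc)                   # negative entries are entering candidates
```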
DW decomposition
Initially \(m+n\) columns are available in the restricted master problem (RMP); the column for \(u_j^k\) is of the form:
\[\begin{pmatrix} A_j x_j^k \\ e_j \end{pmatrix},\]
where \(e_j\) is the \(j\)-th unit vector coming from the convexity constraints. The RCC (the coefficient of \(u_j^k\)) now has the version:
\[\left(c_j - \pi A_j\right)x_j^k - \mu_j,\]
where \(\pi\) are the dual values of the linking constraints and \(\mu_j\) that of the \(j\)-th convexity constraint.
Solve the RMP to optimality. Then check whether any column not yet included has an RCC smaller than 0; add such a column to the RMP and re-solve. We select the column by solving, for each \(j\), the pricing subproblem:
\[\min_{x_j \in \mathcal S_j}\ \left(c_j - \pi A_j\right)x_j - \mu_j.\]
That is, we are looking for the extreme point of \(\mathcal S_j\) with the most negative reduced cost. Let \(\bar x_j\) be the optimal solution and let
\[\delta = \left(c_j - \pi A_j\right)\bar x_j - \mu_j.\]
If \(\delta<0\), form a new column:
\[\begin{pmatrix} A_j \bar x_j \\ e_j \end{pmatrix}\]
with cost \(c_j\bar x_j\) and add it to the RMP.
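Below is a minimal column-generation sketch of this loop, under the same assumptions as the earlier snippet (hypothetical toy data, box-shaped blocks so pricing is a simple LP, SciPy's HiGHS for both the RMP and the pricing); it is an illustration, not a production implementation:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy block-angular data: min sum_j c_j x_j
# s.t. sum_j A_j x_j = b and x_j in S_j = {0 <= x_j <= ub_j}.
c = [np.array([1.0, 2.0]), np.array([3.0, 1.0])]
A = [np.array([[1.0, 1.0]]), np.array([[1.0, 2.0]])]
b = np.array([4.0])
ub = [np.array([2.0, 2.0]), np.array([2.0, 2.0])]
m, n = len(b), 2

# Start from extreme points that make the initial RMP feasible
# (a Big-M phase would handle this in general).
cols = [[np.array([2.0, 2.0])], [np.zeros(2)]]

for it in range(50):
    # Build and solve the RMP over the weights u_j^k.
    obj = [c[j] @ xk for j in range(n) for xk in cols[j]]
    A_eq = np.array([np.concatenate([A[j] @ xk, np.eye(n)[j]])
                     for j in range(n) for xk in cols[j]]).T
    b_eq = np.concatenate([b, np.ones(n)])
    res = linprog(obj, A_eq=A_eq, b_eq=b_eq, bounds=(0, None),
                  method="highs")
    pi, mu = res.eqlin.marginals[:m], res.eqlin.marginals[m:]

    # Pricing: minimise (c_j - pi A_j) x_j over S_j for each block.
    improved = False
    for j in range(n):
        sub = linprog(c[j] - pi @ A[j],
                      bounds=list(zip(np.zeros(2), ub[j])), method="highs")
        if sub.fun - mu[j] < -1e-9:     # delta < 0: an improving column
            cols[j].append(sub.x)
            improved = True
    if not improved:                    # no negative RCC: MP solved
        break

print("MP optimum:", res.fun)
```

Note the RMP never stores the block constraints themselves; they are enforced implicitly because every generated column is an extreme point of its \(\mathcal S_j\).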
Bounds
- Any time we solve the RMP, its objective provides an upper bound on the optimal value of the MP, because during the course of the algorithm we may still find new columns with a negative reduced cost that decrease the objective value.
- Every time we solve the subproblems we can obtain a lower bound. Let \(\bar x_j^v\) be the subproblem solution at a given iteration \(v\) with duals \(\pi^v\), and let \(\bar x_j\) be the globally optimal solution. Since \(\bar x_j^v\) minimizes \(\left(c_j-\pi^v A_j\right)x_j\) over \(\mathcal S_j\) and \(\bar x_j\in\mathcal S_j\), we have:
\[\begin{aligned} \sum_{j=1}^{n}\left(c_{j}-\pi^{v} A_{j}\right) \bar{x}_{j}^{v} & \leq \sum_{j=1}^{n}\left(c_{j}-\pi^{v} A_{j}\right) \bar{x}_{j}\\ &=\sum_{j=1}^{n} c_{j} \bar{x}_{j}-\pi^{v} \sum_{j=1}^{n} A_{j} \bar{x}_{j} \\ &=\sum_{j=1}^{n} c_{j} \bar{x}_{j}-\pi^{v} b \end{aligned} \]which implies:
\[\sum_{j=1}^{n} c_{j} \bar{x}_{j} \geq \sum_{j=1}^{n}\left(c_{j}-\pi^{v} A_{j}\right) \bar{x}_{j}^{v}+\pi^{v} b \]
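As a quick numeric check (reusing the hypothetical toy data from the sketches above, with an arbitrary dual vector \(\pi\)): the derivation never used optimality of \(\pi^v\), only that \(\bar x_j^v\) minimizes the pricing objective, so any \(\pi\) yields a valid lower bound:

```python
import numpy as np
from scipy.optimize import linprog

c = [np.array([1.0, 2.0]), np.array([3.0, 1.0])]
A = [np.array([[1.0, 1.0]]), np.array([[1.0, 2.0]])]
b = np.array([4.0])
ub = [np.array([2.0, 2.0]), np.array([2.0, 2.0])]

pi = np.array([1.5])                        # an arbitrary dual vector
subs = [linprog(c[j] - pi @ A[j],
                bounds=list(zip(np.zeros(2), ub[j])), method="highs")
        for j in range(2)]
lower = sum(s.fun for s in subs) + pi @ b   # sum_j (c_j - pi A_j) xbar_j^v + pi b
print(lower)                                # 1.0 <= 2.0, the true optimum here
```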