Applied Operations Research 4: Dantzig-Wolfe Decomposition

Dantzig-Wolfe Decomposition

    Problem description

Removing a certain constraint from the program would dramatically simplify the solving process.

Assume the program is as below and constraint \(\ref{eq1}\) is the complicating constraint.

\[\begin{align} \min \sum_{i}c_ix_i&\\ \text{s.t. } \sum_iA_ix_i&=b\label{eq1}\\ B_ix_i&=b_i,\ \forall i\\ x_i&\geq 0,\ \forall i \end{align} \]

Initially there are \(\sum_{j}n_j\) variables and \(m+\sum_jm_j\) constraints in total, where \(n_j\) is the number of components of \(x_j\), \(m\) is the number of rows of the coupling constraint \(\ref{eq1}\), and \(m_j\) is the number of rows of \(B_j\).
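For concreteness, here is a small illustration of the structure (my own example, not from the original): with two blocks, the constraint matrix is block-angular, and only the first row of constraints couples the blocks.

\[
\begin{pmatrix} A_1 & A_2 \\ B_1 & 0 \\ 0 & B_2 \end{pmatrix}
\begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
=
\begin{pmatrix} b \\ b_1 \\ b_2 \end{pmatrix}
\]

Dropping the coupling row \(A_1x_1+A_2x_2=b\) would leave two independent and much smaller problems, which is why \(\ref{eq1}\) is called the complicating constraint.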

Dantzig-Wolfe Reformulation

Replace \(x_i\) by a convex combination of the extreme points of the feasible region of \(B_ix_i=b_i,\ x_i\geq 0\).

Formally, let \(\mathcal S_j=\{x_j\mid x_j\geq 0,\,B_jx_j=b_j\}\) be the set of feasible solutions for block \(j\), which is a polyhedron (assumed bounded here), and let \(\{x_j^1,\ldots,x_j^{K_j}\}\) denote its extreme points. Then we can express any point in \(\mathcal S_j\) as

\[x_j=\sum_{k=1}^{K_j}u_j^kx_j^k \]

where \(\sum_{k=1}^{K_j}u_j^k=1\) and \(u_j^k\geq 0\).
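As a toy example (my own, not from the post): take \(\mathcal S_j\) to be the unit square \([0,1]^2\), a bounded polyhedron whose extreme points are the four corners. Then, for instance,

\[
\begin{pmatrix}0.5\\0.25\end{pmatrix}
=0.5\begin{pmatrix}0\\0\end{pmatrix}
+0.25\begin{pmatrix}1\\0\end{pmatrix}
+0\begin{pmatrix}0\\1\end{pmatrix}
+0.25\begin{pmatrix}1\\1\end{pmatrix},
\qquad 0.5+0.25+0+0.25=1 .
\]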

The new program (the master problem) is:

    \[MP(u_j^k):\min \sum_{j=1}^{n} c_{j}\left(\sum_{k=1}^{K_{j}} x_{j}^{k} u_{j}^{k}\right) \]

with constraints:

\[\begin{array}{ll} \text { s.t. } & \sum_{j=1}^{n} A_{j}\left(\sum_{k=1}^{K_{j}} x_{j}^{k} u_{j}^{k}\right)=b \\ & \sum_{k=1}^{K_{j}} u_{j}^{k}=1, \quad j=1,\ldots,n \\ & u_{j}^{k} \geq 0, \quad \forall j,k \end{array} \]

The new program has \(\sum_jK_j\) variables (which can be very large) and \(m+n\) constraints.
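For example (illustrative numbers of my own): with \(n=2\) blocks, \(m=1\) coupling constraint, and \(K_1=K_2=4\) extreme points per block, \(MP\) has \(K_1+K_2=8\) variables and \(m+n=3\) constraints, regardless of how many variables \(n_j\) each block originally had.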

Observe that \(\sum_{k=1}^{K_j}x_j^ku_j^k\) represents a point in \(\mathcal S_j\) and, as such, it automatically satisfies the constraint \(B_{j}\left(\sum_{k=1}^{K_{j}} x_{j}^{k} u_{j}^{k}\right)=b_{j}\); this is why these constraints no longer appear in the master problem.

Dantzig-Wolfe Decomposition

The idea is to solve \(MP\) with the simplex method without having all of its columns available explicitly.

    Recap of Simplex

Consider the LP:

    \[\begin{aligned} \min z=& c^{\top} x \label{eq2} \\ & A x=b \\ & x \geq 0 \end{aligned} \]

We can partition the variables into basic and non-basic variables:

    \[\begin{aligned} \min z=& c_{B}^{\top} x_{B}+c_{N}^{\top} x_{N} \\ & B x_{B}+N x_{N}=b \\ & x_{B}, x_{N} \geq 0 \end{aligned} \]

From the equality constraints, we can express the basic variables in terms of the non-basic ones:

    \[x_{B}=B^{-1} b-B^{-1} N x_{N} \]

and the objective function becomes:

\[\begin{aligned} z&=c_{B}^{\top}\left(B^{-1} b-B^{-1} N x_{N}\right)+c_{N}^{\top} x_{N}\\ &=c_{B}^{\top}\left(B^{-1} b\right)+\left(c_{N}^{\top}-c_{B}^{\top} B^{-1} N\right) x_{N} \end{aligned} \]

We refer to the coefficients of \(x_N\) as the reduced cost coefficients (RCCs). If an RCC is negative, increasing the corresponding non-basic variable can only decrease the objective; that is, we want to bring that non-basic variable into the basis to replace one of the current basic variables. If all RCCs are greater than or equal to 0, the simplex method terminates with an optimal solution.

Considering the dual of the program, we have:

\[\begin{aligned} \max w=& \pi^{\top} b \\ & \pi^{\top} A \leq c^{\top} \end{aligned} \]

    By strong duality, we have:

    \[z^{*}=c_{B}^{\top} B^{-1} b=w^{*}=\pi^{*} b \]

Hence the RCC vector is \(c_{N}^{\top}-c_{B}^{\top} B^{-1} N=c_{N}^{\top}-\pi^{*} N\).
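A small numpy sketch (toy numbers of my own, not from the post) that computes the reduced cost coefficients both ways and checks that they coincide:

```python
import numpy as np

# Toy equality-form LP data: min c^T x, A x = b, x >= 0 (illustrative numbers only).
A = np.array([[1.0, 1.0, 1.0, 0.0],
              [1.0, 3.0, 0.0, 1.0]])
b = np.array([4.0, 6.0])
c = np.array([-3.0, -5.0, 0.0, 0.0])

basic, nonbasic = [2, 3], [0, 1]              # a basis: here the two slack columns
B, N = A[:, basic], A[:, nonbasic]
c_B, c_N = c[basic], c[nonbasic]

x_B = np.linalg.solve(B, b)                   # basic solution: x_B = B^{-1} b
pi = np.linalg.solve(B.T, c_B)                # simplex multipliers: pi^T = c_B^T B^{-1}

rcc_primal = c_N - c_B @ np.linalg.solve(B, N)   # c_N^T - c_B^T B^{-1} N
rcc_dual   = c_N - pi @ N                        # c_N^T - pi^T N
assert np.allclose(rcc_primal, rcc_dual)         # both expressions give the same RCCs
print(x_B, rcc_primal)                           # negative RCCs -> entering candidates
```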

    DW decomposition

Initially, \(m+n\) columns are available in the restricted master problem (RMP); the columns are of the form:

    \[\left(A_{j} x_{j}^{k}, 0, \ldots, 0,1,0 \ldots, 0\right) \]

The RCC of variable \(u_j^k\) is now (with \(\pi^*\) the duals of the coupling constraints and \(\sigma_j^*\) the dual of block \(j\)'s convexity constraint):

    \[c_{j} x_{j}^{k}-\pi^{*} A_{j} x_{j}^{k}-\sigma_{j}^{*} \]

Solve the RMP to optimality. Then check whether any column not yet included has an RCC smaller than 0; if so, add it to the RMP and re-solve. We select the column by solving, for each \(j\), the pricing subproblem:

    \[\begin{gathered} \min \left(c_{j}-\pi^{*} A_{j}\right) x_{j}-\sigma_{j}^{*} \\ B_{j} x_{j}=b_{j}\\ x_j\geq 0 \end{gathered} \]

That is, we are looking for the extreme point of \(\mathcal S_j\) with the most negative reduced cost. Let \(\bar x_j\) be the optimal solution of subproblem \(j\), and let

    \[\delta=\min _{j=1, \ldots, n}\left\{\left(c_{j}-\pi^{*} A_{j}\right) \bar{x}_{j}-\sigma_{j}^{*}\right\} \]

If \(\delta<0\), let \(j^*\) be the minimizing block and form a new column:

\[\left(A_{j^*} \bar{x}_{j^*}, 0, \ldots, 0,1,0, \ldots, 0\right) \]

with cost \(c_{j^*}\bar x_{j^*}\) and add it to the RMP.
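To make the loop concrete, here is a minimal end-to-end sketch of the column generation procedure on a toy two-block instance (my own illustrative data and code, not from the original post). It uses `scipy.optimize.linprog` for both the RMP and the pricing subproblems and assumes the HiGHS backend exposes the equality-constraint duals via `res.eqlin.marginals`; the big-M artificial columns are just one simple way to keep the RMP feasible before enough real columns exist.

```python
import numpy as np
from scipy.optimize import linprog

# Column-generation sketch for Dantzig-Wolfe decomposition.
# Block data for  min sum_j c_j x_j  s.t.  sum_j A_j x_j = b (coupling),  B_j x_j = b_j,  x_j >= 0.
# Toy instance with n = 2 blocks and m = 1 coupling row; all data and names are illustrative.
blocks = [
    dict(c=np.array([1.0, 2.0]), A=np.array([[1.0, 0.0]]),
         B=np.array([[1.0, 1.0]]), b=np.array([3.0])),
    dict(c=np.array([2.0, 1.0]), A=np.array([[1.0, 0.0]]),
         B=np.array([[1.0, 1.0]]), b=np.array([4.0])),
]
b_coupling = np.array([5.0])
m, n = len(b_coupling), len(blocks)
BIG_M = 1e4                      # cost of artificial columns that keep the RMP feasible

def solve_block(j, obj):
    """Pricing subproblem for block j:  min obj^T x  s.t.  B_j x = b_j,  x >= 0."""
    d = blocks[j]
    res = linprog(obj, A_eq=d["B"], b_eq=d["b"], bounds=(0, None), method="highs")
    return res.x, res.fun

# Start with one extreme point per block plus one artificial column per coupling row.
columns = []                     # each entry: (cost, coupling contribution A_j x, block index)
for j in range(n):
    x0, _ = solve_block(j, blocks[j]["c"])
    columns.append((blocks[j]["c"] @ x0, blocks[j]["A"] @ x0, j))
for i in range(m):
    e = np.zeros(m); e[i] = 1.0
    columns.append((BIG_M, e, None))          # artificial column: no convexity entry

for it in range(50):
    # Build and solve the restricted master problem (RMP) over the current columns.
    cost = np.array([col[0] for col in columns])
    A_eq = np.zeros((m + n, len(columns)))
    for k, (_, a, j) in enumerate(columns):
        A_eq[:m, k] = a
        if j is not None:
            A_eq[m + j, k] = 1.0              # convexity constraint of block j
    b_eq = np.concatenate([b_coupling, np.ones(n)])
    rmp = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=(0, None), method="highs")
    duals = rmp.eqlin.marginals               # assumed: equality duals (pi, sigma) from HiGHS
    pi, sigma = duals[:m], duals[m:]

    # Pricing: for each block, min (c_j - pi A_j) x - sigma_j  over  B_j x = b_j, x >= 0.
    delta, new_col = 0.0, None
    for j in range(n):
        xbar, val = solve_block(j, blocks[j]["c"] - pi @ blocks[j]["A"])
        if val - sigma[j] < delta - 1e-9:
            delta = val - sigma[j]
            new_col = (blocks[j]["c"] @ xbar, blocks[j]["A"] @ xbar, j)
    if new_col is None:                        # delta >= 0: no improving column, RMP is optimal
        print("optimal objective:", rmp.fun)
        break
    columns.append(new_col)
```

In a real implementation one would warm-start the RMP, replace big-M with a proper Phase I, and typically add several negative-reduced-cost columns per iteration, but the structure of the loop is the same.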

    Bounds

1. Any time we solve the RMP, its objective value provides an upper bound on the optimal value of \(MP\).

This is because the RMP optimizes over only a subset of the columns of \(MP\): its solution is feasible for \(MP\), and during the course of the algorithm we may still find new columns with a negative reduced cost that further decrease the objective value.

2. Every time we solve the subproblems we can obtain a lower bound (see the sketch after this list).

Assume \(\bar x_j^v\) is the subproblem solution obtained at a given iteration \(v\) (with duals \(\pi^v\)) and \(\bar x_j\) is the globally optimal solution of the original program. Since \(\bar x_j^v\) minimizes \((c_j-\pi^vA_j)x_j\) over \(\mathcal S_j\) while \(\bar x_j\in\mathcal S_j\), we have:

      \[\begin{aligned} \sum_{j=1}^{n}\left(c_{j}-\pi^{v} A_{j}\right) \bar{x}_{j}^{v} & \leq \sum_{j=1}^{n}\left(c_{j}-\pi^{v} A_{j}\right) \bar{x}_{j}\\ &=\sum_{j=1}^{n} c_{j} \bar{x}_{j}-\pi^{v} \sum_{j=1}^{n} A_{j} \bar{x}_{j} \\ &=\sum_{j=1}^{n} c_{j} \bar{x}_{j}-\pi^{v} b \end{aligned} \]

      which implies:

\[\sum_{j=1}^{n} c_{j} \bar{x}_{j} \geq \sum_{j=1}^{n}\left(c_{j}-\pi^{v} A_{j}\right) \bar{x}_{j}^{v}+\pi^{v} b \]

The left-hand side is the optimal objective value, so the right-hand side, which can be computed from the current subproblem values and duals, is a valid lower bound.
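Hooking this back to the column-generation sketch above (a hedged illustration: `rmp`, `pi`, `b_coupling`, and `it` are the names assumed from that code, and `sub_vals` is assumed to be a list collecting the subproblem optimal values `val` for every block during the pricing loop), the two bounds at a given iteration could be computed as:

```python
# After the pricing loop of iteration `it` (names assumed from the sketch above):
# sub_vals[j] = min over S_j of (c_j - pi A_j) x, collected for every block j.
upper_bound = rmp.fun                          # RMP value; valid once all artificial columns are 0
lower_bound = sum(sub_vals) + pi @ b_coupling  # sum_j (c_j - pi A_j) xbar_j^v + pi b
print(f"iteration {it}: {lower_bound:.4f} <= z* <= {upper_bound:.4f}")
```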
