Branch and cut

Branch and cut[1] is a method of combinatorial optimization for solving integer linear programs (ILPs), that is, linear programming (LP) problems where some or all of the unknowns are restricted to integer values.[2] Branch and cut involves running a branch and bound algorithm and using cutting planes to tighten the linear programming relaxations. If cuts are used only to tighten the initial LP relaxation, the algorithm is instead called cut and branch.

Algorithm

This description assumes the ILP is a maximization problem.

The method solves the linear program without the integer constraint using the regular simplex algorithm. When an optimal solution is obtained, and this solution has a non-integer value for a variable that is supposed to be integer, a cutting plane algorithm may be used to find further linear constraints which are satisfied by all feasible integer points but violated by the current fractional solution. These inequalities may be added to the linear program, such that resolving it will yield a different solution which is hopefully "less fractional".
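As a small worked example, consider the integer program

  maximize x₁ + x₂ subject to 2x₁ + 3x₂ ≤ 6, 3x₁ + 2x₂ ≤ 6, x₁, x₂ ≥ 0 and integer.

Its LP relaxation attains its optimum at (x₁, x₂) = (1.2, 1.2) with value 2.4. Summing the two constraints and dividing by 5 gives x₁ + x₂ ≤ 2.4; because the left-hand side is an integer at every feasible integer point, the rounded (Chvátal–Gomory) cut x₁ + x₂ ≤ 2 is satisfied by all feasible integer points yet violated by the fractional optimum. Re-solving the relaxation with this cut added yields an integer optimum such as (2, 0) with value 2.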

At this point, the branch and bound part of the algorithm is started. The problem is split into multiple (usually two) versions. The new linear programs are then solved using the simplex method and the process repeats. During the branch and bound process, non-integral solutions to LP relaxations serve as upper bounds and integral solutions serve as lower bounds. A node can be pruned if an upper bound is lower than an existing lower bound. Further, when solving the LP relaxations, additional cutting planes may be generated, which may be either global cuts, i.e., valid for all feasible integer solutions, or local cuts, meaning that they are satisfied by all solutions fulfilling the side constraints from the currently considered branch and bound subtree.
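For example, if the best integer solution found so far has objective value 7 and the LP relaxation of some subproblem has an optimal value of 6.5, then no integer solution within that subproblem can score better than 6.5, so the subproblem can be pruned without further branching.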

The algorithm is summarized below.

  1. Add the initial ILP to L, the list of active problems
  2. Set x* = null and v* = −∞
  3. while L is not empty
    1. Select and remove (de-queue) a problem from L
    2. Solve the LP relaxation of the problem.
    3. If the solution is infeasible, go back to 3 (while). Otherwise denote the solution by x with objective value v.
    4. If v ≤ v*, go back to 3.
    5. If x is integer, set v* ← v, x* ← x and go back to 3.
    6. If desired, search for cutting planes that are violated by x. If any are found, add them to the LP relaxation and return to 3.2.
    7. Branch to partition the problem into new problems with restricted feasible regions. Add these problems to L and go back to 3.
  4. return x*

Pseudocode

In C++-like pseudocode, this could be written:

// ILP branch and cut solution pseudocode, assuming the objective is to be maximized
ILP_solution branch_and_cut_ILP(IntegerLinearProgram initial_problem,
                                bool hunting_for_cutting_planes /* parameter of the algorithm; see step 3.6 */) {
    queue<IntegerLinearProgram> active_list; // L, above
    active_list.enqueue(initial_problem); // step 1
    // step 2
    ILP_solution optimal_solution; // this will hold x* above
    double best_objective = -std::numeric_limits<double>::infinity(); // will hold v* above
    while (!active_list.empty()) { // step 3 above
        IntegerLinearProgram curr_prob = active_list.dequeue(); // step 3.1
        bool cutting_planes_found = false; // declared outside the do-while so its condition can test it
        do { // steps 3.2-3.7
            cutting_planes_found = false;
            RelaxedLinearProgram relaxed_prob = LP_relax(curr_prob); // step 3.2
            LP_solution curr_relaxed_soln = LP_solve(relaxed_prob); // this is x above
            if (!curr_relaxed_soln.is_feasible()) { // step 3.3
                continue; // infeasible subproblem; continues at step 3
            }
            double current_objective_value = curr_relaxed_soln.value(); // v above
            if (current_objective_value <= best_objective) { // step 3.4
                continue; // cannot beat the incumbent; continues at step 3
            }
            if (curr_relaxed_soln.is_integer()) { // step 3.5
                best_objective = current_objective_value;
                optimal_solution = cast_as_ILP_solution(curr_relaxed_soln);
                continue; // new incumbent found; continues at step 3
            }
            // current relaxed solution isn't integral
            if (hunting_for_cutting_planes) { // step 3.6
                auto violated_cutting_planes = search_for_violated_cutting_planes(curr_relaxed_soln);
                if (!violated_cutting_planes.empty()) { // step 3.6
                    cutting_planes_found = true; // will continue at step 3.2
                    for (auto&& cutting_plane : violated_cutting_planes) {
                        curr_prob.add_cut(cutting_plane); // add the cut so the next relaxation is tighter
                    }
                    continue; // continues at step 3.2
                }
            }
            // step 3.7: either violated cutting planes were not found, or we weren't looking for them
            auto&& branched_problems = branch_partition(curr_prob);
            for (auto&& branch : branched_problems) {
                active_list.enqueue(branch);
            }
            continue; // continues at step 3
        } while (hunting_for_cutting_planes && cutting_planes_found);
        // end step 3.2 do-while loop: it repeats only while new cutting planes keep being found
    } // end step 3 while loop
    return optimal_solution; // step 4
}

In the above pseudocode, the functions LP_relax, LP_solve, branch_partition and search_for_violated_cutting_planes, as well as the add_cut operation on subproblems, are subroutines that must be provided as applicable to the problem. For example, LP_solve could call the simplex algorithm. Branching strategies for branch_partition are discussed below.
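For concreteness, one possible set of declarations consistent with how the pseudocode invokes these subroutines is sketched below; all of the type names and signatures are illustrative assumptions, not the interface of any particular solver library.

#include <vector>

// Hypothetical interfaces matching the calls in the pseudocode above;
// every name and signature here is an illustrative assumption.
struct CuttingPlane { /* a linear inequality satisfied by all feasible integer points */ };

struct IntegerLinearProgram {
    void add_cut(const CuttingPlane&); // tighten this subproblem's formulation
    /* objective, constraints and integrality markers omitted */
};

struct RelaxedLinearProgram { /* the subproblem with integrality dropped */ };

struct LP_solution {
    bool   is_feasible() const;
    bool   is_integer()  const;
    double value()       const;       // objective value of the relaxed optimum
};

struct ILP_solution { /* an integer-feasible assignment and its objective value */ };

RelaxedLinearProgram LP_relax(const IntegerLinearProgram&); // drop the integrality constraints
LP_solution LP_solve(const RelaxedLinearProgram&);          // e.g. the simplex method
ILP_solution cast_as_ILP_solution(const LP_solution&);
std::vector<CuttingPlane> search_for_violated_cutting_planes(const LP_solution&);
std::vector<IntegerLinearProgram> branch_partition(const IntegerLinearProgram&); // see branching strategies below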

Branching strategies

An important step in the branch and cut algorithm is the branching step. At this step, there are a variety of branching heuristics that can be used. The branching strategies described below all involve what is called branching on a variable.[3] Branching on a variable involves choosing a variable x_i whose value x̄_i in the optimal solution to the current LP relaxation is fractional, and then adding the constraints x_i ≤ ⌊x̄_i⌋ and x_i ≥ ⌈x̄_i⌉, one to each of the two resulting subproblems.
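A minimal sketch of this branching step, reusing the hypothetical IntegerLinearProgram type from the interface sketch above; the add_upper_bound and add_lower_bound helpers are likewise assumptions, not part of any particular solver:

#include <cmath>
#include <vector>

// Branch on variable i, whose value frac_val in the LP relaxation is fractional
// (e.g. x_i = 2.6 yields the children x_i <= 2 and x_i >= 3).
std::vector<IntegerLinearProgram> branch_on_variable(const IntegerLinearProgram& prob,
                                                     int i, double frac_val) {
    IntegerLinearProgram down = prob;              // child with x_i <= floor(frac_val)
    down.add_upper_bound(i, std::floor(frac_val)); // assumed helper

    IntegerLinearProgram up = prob;                // child with x_i >= ceil(frac_val)
    up.add_lower_bound(i, std::ceil(frac_val));    // assumed helper

    // Every feasible integer point satisfies one of the two children,
    // while the fractional point x_i = frac_val satisfies neither.
    return {down, up};
}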

  • Most Infeasible Branching This branching strategy chooses the variable with the fractional part closest to 0.5; a sketch of this selection rule is given after this list.
  • Pseudo Cost Branching The basic idea of this strategy is to keep track, for each variable, of the change in the objective function observed when that variable was previously chosen as the branching variable. The strategy then branches on the variable that is predicted to change the objective function the most, based on those past changes. Note that pseudo cost branching is initially uninformative, since few variables have been branched on early in the search.
  • Strong Branching Strong branching involves testing which of the candidate variables gives the best improvement to the objective function before actually branching on one of them. Full strong branching tests all candidate variables and can be computationally expensive. The computational cost can be reduced by only considering a subset of the candidate variables and not solving each of the corresponding LP relaxations to completion.
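As a minimal illustration of the most infeasible rule, the following sketch picks the branching variable from a vector of LP solution values; the function name and interface are assumptions made for illustration:

#include <cmath>
#include <cstddef>
#include <vector>

// Most infeasible branching: pick the fractional variable whose fractional part
// is closest to 0.5. A return value of -1 means every variable is (numerically) integral.
int choose_most_infeasible(const std::vector<double>& lp_values, double tol = 1e-6) {
    int best_index = -1;
    double best_score = -1.0; // score = 0.5 - |frac - 0.5|, maximal when frac = 0.5
    for (std::size_t i = 0; i < lp_values.size(); ++i) {
        double frac = lp_values[i] - std::floor(lp_values[i]);
        if (frac < tol || frac > 1.0 - tol) continue; // already integral within tolerance
        double score = 0.5 - std::abs(frac - 0.5);
        if (score > best_score) {
            best_score = score;
            best_index = static_cast<int>(i);
        }
    }
    return best_index;
}

Pseudo cost and strong branching would replace the score above with, respectively, a history-based estimate of the objective change or the improvement measured by tentatively solving the child relaxations.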

There are also a large number of variations of these branching strategies, such as using strong branching early on, when pseudo cost branching is still relatively uninformative, and then switching to pseudo cost branching once enough branching history has accumulated for it to be informative.


References

  1. Padberg, Manfred; Rinaldi, Giovanni (1991). "A Branch-and-Cut Algorithm for the Resolution of Large-Scale Symmetric Traveling Salesman Problems". SIAM Review. 33 (1): 60–100. doi:10.1137/1033004. ISSN 0036-1445.
  2. John E., Mitchell (2002). "Branch-and-Cut Algorithms for Combinatorial Optimization Problems" (PDF). Handbook of Applied Optimization: 65–77.
  3. Achterberg, Tobias; Koch, Thorsten; Martin, Alexander (2005). "Branching rules revisited". Operations Research Letters. 33 (1): 42–54. doi:10.1016/j.orl.2004.04.002.