HiveBrain v1.2.0

Knapsack branch and bound: forward filter

Submitted by: @import:stackexchange-codereview

Problem

I've coded branch and bound to solve the knapsack problem, and use a greedy linear relaxation to find an upper bound on a node I'm currently exploring. What I mean by this is that I first sort the items by their respective value density (computed value/weight, for each item) and then fill the bag according to a greedy algorithm, using a fraction of the last item that would put us over the bag's capacity. This solution is unattainable (we can't take half an item) but provides an upper bound on the maximum value we can attain from considering an item j:

def _linear_relaxation(self, j, cp, cw):
    # we consider how well we can do to increase our current profit (cp) 
    # by looking at items beyond the item that we either just took or left behind 

    weight = cw
    value  = cp

    for (v, w), index in self.ordered:
        if index > j:

            if w + weight <= self.capacity:
                value  += v
                weight += w
            else:
                return value + (v * (self.capacity - w)/float(w))

    return value


My code works great to solve small problems using this relaxation, but starts slowing down when the problem size increases. One optimization (or reduction on a node's upper bound) is to avoid taking items of high value density whose weight exceeds the leftover weight we're trying to fill.

Think about it this way: our greedy linear relaxation takes items in decreasing order of density, value/weight. If the residual (remaining) capacity of the bag is less than the weight of the item we're considering adding to the bound, it can be much better to take items further down the density list. Consider, for example, residual = 1 and weight = 2: the fractional term contributes only value/2. Wouldn't it be great if we found a lighter item that fits whole, so we could add its full value instead? We'd get a much better estimate of the least upper bound than if we always took the next item of greatest density, with no regard to its relative weight.
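To make the idea concrete, here is a sketch of that forward filter next to the plain fractional bound. Everything here is illustrative: fractional_bound and filtered_bound are free functions over (value, weight) pairs rather than the question's class methods, and whether the filtered value always remains a valid upper bound is exactly what the counterexample arithmetic in the Solution below probes.

```python
def fractional_bound(items, capacity):
    """Standard greedy fractional bound: take whole items in decreasing
    density order until one no longer fits, then a fraction of that item.
    `items` is a list of (value, weight) pairs sorted by decreasing
    value density."""
    value = weight = 0.0
    for v, w in items:
        if weight + w <= capacity:
            value += v
            weight += w
        else:
            return value + v * (capacity - weight) / float(w)
    return value


def filtered_bound(items, capacity):
    """'Forward filter' variant: items whose whole weight exceeds the
    residual capacity are skipped, so lighter items further down the
    density list can still be taken whole; any leftover space is finally
    topped up with a fraction of the densest skipped item."""
    value = weight = 0.0
    skipped = []                      # items that did not fit whole
    for v, w in items:
        residual = capacity - weight
        if residual <= 0:
            break
        if w <= residual:
            value += v
            weight += w
        else:
            skipped.append((v, w))
    residual = capacity - weight
    if skipped and residual > 0:
        v, w = skipped[0]             # densest skipped item (input is sorted)
        value += v * residual / float(w)
    return value
```

On an instance like [(6, 4), (4.4, 3), (2, 2)] with capacity 6, the plain bound is 6 + 4.4*2/3 ≈ 8.93, while the filtered bound takes the (2, 2) item whole and returns 8, which here equals the true optimum.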

Solution

Your first upper bound is buggy

return value + (v * (self.capacity - w)/float(w))


should be:

return value + (v * (self.capacity - weight)/float(w))
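For reference, here is the corrected relaxation as a standalone function; this is a sketch mirroring the question's method, with ordered and capacity passed in explicitly instead of living on self:

```python
def linear_relaxation(ordered, capacity, j, cp, cw):
    """Greedy fractional upper bound, starting from current profit `cp`
    and current weight `cw`, over items with original index > j.

    `ordered` holds ((value, weight), original_index) pairs sorted by
    decreasing value density, mirroring `self.ordered` in the question.
    """
    weight, value = cw, cp
    for (v, w), index in ordered:
        if index > j:
            if w + weight <= capacity:
                value += v
                weight += w
            else:
                # Fraction of the first item that does not fit: scale by
                # the *residual* capacity, not by the item's own weight.
                return value + v * (capacity - weight) / float(w)
    return value
```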


If dw*k <= dk*W then V_ub >= V_opt, and it is proved that this is an upper bound.

Edit/Addendum: Improve your lower bound

You initialize a best node with a greedy solution at the start. Then you update the best solution as follows (similarly for the left node):

if right.V > best.V:            # update our best, if possible
    best = right


Then you calculate the upper bound with the linear relaxation method:

bound = self._linear_relaxation(index, right.V, right.W)


And then, only if a branch's upper bound is at least the best solution's partial value do you continue evaluating the branch:

if bound >= best.V:
    stack.append(right)


This effectively makes the best solution's partial value your lower bound. The lower and upper bounds should bracket the best achievable value in a branch as tightly as possible. Using only the partial value accumulated so far grossly underestimates the lower bound, which makes your search space larger. Remember, your search space grows with the gap between your bounds.

You set the lower bound here (similarly for the left case):

right = TreeNode( index + 1, current.V + self.values[index],
                              current.W + self.weights[index], taken )


right.V is used as a lower bound, which it technically is, albeit a poor one.

Instead of:

if right.V > best.V:            # update our best, if possible
    best = right


You should do:

# Somewhere earlier:
best = self._greedy_solution()
best_lb = best.V   # here V is a lower bound on the whole search space -- good!

# Later on...
right = TreeNode(index + 1, current.V + self.values[index],
                 current.W + self.weights[index], taken)
lower_bound = greedy_solution_on_remaining_items(right)
if lower_bound > best_lb:       # update our best, if possible
    best = right
    best_lb = lower_bound


And similar checks for the left case.
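As a sketch of what greedy_solution_on_remaining_items could look like: the name and signature below are my illustration of the helper the answer mentions, not code from the post. A whole-items-only greedy completion works because any feasible packing of the remaining items is a valid lower bound on the best value in the branch:

```python
def greedy_lower_bound(partial_value, partial_weight, remaining, capacity):
    """Whole-item greedy completion of a partial solution.

    `remaining` holds (value, weight) pairs for the items still
    undecided below this node, sorted by decreasing value density.
    Only whole items are added, so the result is always feasible and
    hence a valid lower bound; the real helper would read the node and
    the instance data off `self` instead of taking them as arguments.
    """
    value, weight = partial_value, partial_weight
    for v, w in remaining:
        if weight + w <= capacity:   # whole items only: stays feasible
            value += v
            weight += w
    return value
```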

Code Snippets

r_0 = k
r_1 = r_2 = k - dk
w_0 = W - dw
w_1 = w_2 = W/2
V_ub  = r_0*w_0 = k*(W - dw) = k*W - dw*k
V_opt = r_1*w_1 + r_2*w_2 = 2*(k - dk)*W/2 = k*W - dk*W
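To sanity-check the derivation numerically, a tiny script; k, dk, W, dw are the answer's symbols, and the concrete values below are my own illustration, not from the post:

```python
# Illustrative values for the answer's symbols (my own choice of numbers).
k, dk = 10.0, 2.0      # density of the heavy item, and the density gap
W, dw = 8.0, 1.0       # knapsack capacity, and the heavy item's slack

r_0, r_1 = k, k - dk   # value densities
w_0, w_1 = W - dw, W / 2

V_ub = r_0 * w_0           # = k*W - dw*k = 70
V_opt = 2 * r_1 * w_1      # = k*W - dk*W = 64

assert V_ub == k * W - dw * k
assert V_opt == k * W - dk * W
# dw*k = 10 <= dk*W = 16, so the bound dominates the optimum:
assert dw * k <= dk * W and V_ub >= V_opt
```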

Context

StackExchange Code Review Q#44421, answer score: 13
