Hill climbing algorithm and Python implementation examples

  • 2020-04-02 13:35:40
  • OfStack

1. An introduction to hill climbing

Hill climbing is an optimization algorithm that generally starts from a random solution and works its way step by step toward an optimal solution (usually a local optimum). Suppose the problem has multiple parameters; at each step, hill climbing increases or decreases the value of one parameter by one unit. For example, suppose a solution consists of three integer parameters x1, x2 and x3, and the initial solution is (2, 2, -2). Increasing/decreasing x1 by 1 gives two solutions (1, 2, -2) and (3, 2, -2); increasing/decreasing x2 by 1 gives (2, 3, -2) and (2, 1, -2); increasing/decreasing x3 by 1 gives (2, 2, -1) and (2, 2, -3). This yields the solution set:
(2, 2, -2), (1, 2, -2), (3, 2, -2), (2, 3, -2), (2, 1, -2), (2, 2, -1), (2, 2, -3)
The best solution in this set becomes the new current solution; its neighbors are then generated in the same way and the best one is chosen again, and so on, until no neighbor improves on the current solution.
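The neighbor generation described above can be sketched as a small helper (a hypothetical function name, not part of the code below), which returns every solution reachable by changing one parameter by +/-1 while staying inside each parameter's allowed range:

```python
def neighbors(sol, ranges):
    """Return all solutions reachable from sol by changing one
    parameter by +/-1, staying inside the given [lo, hi] ranges."""
    result = []
    for i, (lo, hi) in enumerate(ranges):
        if sol[i] > lo:
            result.append(sol[:i] + [sol[i] - 1] + sol[i+1:])
        if sol[i] < hi:
            result.append(sol[:i] + [sol[i] + 1] + sol[i+1:])
    return result

print(neighbors([2, 2, -2], [(-5, 5), (-5, 5), (-5, 5)]))
# -> the six neighbors listed above, e.g. [1, 2, -2], [3, 2, -2], ...
```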

2. A Python example

Suppose y = x1 + x2 - x3, where x1 is an integer in the interval [-2, 5], x2 is an integer in the interval [2, 6], and x3 is an integer in the interval [-5, 2]. Use hill climbing to find the solution that minimizes the value of y.

The code is as follows:


import random

def evaluate(x1, x2, x3):
    return x1 + x2 - x3

if __name__ == '__main__':
    # Allowed integer range [lo, hi] for each of x1, x2, x3
    x_range = [[-2, 5], [2, 6], [-5, 2]]
    # Start from a random solution
    best_sol = [random.randint(x_range[0][0], x_range[0][1]),
                random.randint(x_range[1][0], x_range[1][1]),
                random.randint(x_range[2][0], x_range[2][1])]
    while True:
        best_evaluate = evaluate(best_sol[0], best_sol[1], best_sol[2])
        current_best_value = best_evaluate
        # The current solution plus every neighbor reachable by a +/-1 step
        sols = [best_sol]
        for i in range(len(best_sol)):
            if best_sol[i] > x_range[i][0]:
                sols.append(best_sol[0:i] + [best_sol[i] - 1] + best_sol[i+1:])
            if best_sol[i] < x_range[i][1]:
                sols.append(best_sol[0:i] + [best_sol[i] + 1] + best_sol[i+1:])
        print(sols)
        # Move to the candidate with the smallest value of y
        for s in sols:
            el = evaluate(s[0], s[1], s[2])
            if el < best_evaluate:
                best_sol = s
                best_evaluate = el
        # Stop when no neighbor improves on the current solution
        if best_evaluate == current_best_value:
            break
    print('best sol : ', current_best_value, best_sol)
The output of one run is as follows:
[[0, 5, 1], [-1, 5, 1], [1, 5, 1], [0, 4, 1], [0, 6, 1], [0, 5, 0], [0, 5, 2]]
[[-1, 5, 1], [-2, 5, 1], [0, 5, 1], [-1, 4, 1], [-1, 6, 1], [-1, 5, 0], [-1, 5, 2]]
[[-2, 5, 1], [-1, 5, 1], [-2, 4, 1], [-2, 6, 1], [-2, 5, 0], [-2, 5, 2]]
[[-2, 4, 1], [-1, 4, 1], [-2, 3, 1], [-2, 5, 1], [-2, 4, 0], [-2, 4, 2]]
[[-2, 3, 1], [-1, 3, 1], [-2, 2, 1], [-2, 4, 1], [-2, 3, 0], [-2, 3, 2]]
[[-2, 2, 1], [-1, 2, 1], [-2, 3, 1], [-2, 2, 0], [-2, 2, 2]]
[[-2, 2, 2], [-1, 2, 2], [-2, 3, 2], [-2, 2, 1]]
best sol :  -2 [-2, 2, 2]

As you can see, the minimum value found is -2, reached at (x1, x2, x3) = (-2, 2, 2).

3. How to find the global optimum

The solution found by hill climbing may only be a local optimum. To obtain a better solution, run the hill climbing algorithm many times from different random initial solutions and keep the best of the resulting local optima; this best solution may also be the global optimum.
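This random-restart idea can be sketched as follows, reusing the same greedy descent as the example above (the function names `hill_climb` and `random_restart` are illustrative, not from the original code):

```python
import random

def evaluate(x1, x2, x3):
    return x1 + x2 - x3

def hill_climb(x_range):
    """One greedy descent from a random starting point
    (same logic as the example above)."""
    sol = [random.randint(lo, hi) for lo, hi in x_range]
    while True:
        best_val = evaluate(*sol)
        # Current solution plus all +/-1 neighbors inside the ranges
        candidates = [sol]
        for i, (lo, hi) in enumerate(x_range):
            if sol[i] > lo:
                candidates.append(sol[:i] + [sol[i] - 1] + sol[i+1:])
            if sol[i] < hi:
                candidates.append(sol[:i] + [sol[i] + 1] + sol[i+1:])
        new_sol = min(candidates, key=lambda s: evaluate(*s))
        if evaluate(*new_sol) >= best_val:
            return sol, best_val    # no neighbor improves: local optimum
        sol = new_sol

def random_restart(x_range, restarts=10):
    """Run hill climbing several times and keep the best local optimum."""
    return min((hill_climb(x_range) for _ in range(restarts)),
               key=lambda pair: pair[1])

sol, val = random_restart([[-2, 5], [2, 6], [-5, 2]])
print(sol, val)  # this linear objective always converges to [-2, 2, 2], y = -2
```

For this particular y the objective is linear, so every restart reaches the same global optimum; restarts only pay off when the objective has multiple local optima.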

In addition, simulated annealing is another algorithm that attempts to find the global optimum.
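A minimal simulated-annealing sketch for the same toy problem is shown below; the temperature schedule (`temp`, `cooling`, `steps`) is an illustrative choice, not from the original article. Unlike hill climbing, a worse neighbor is accepted with probability exp(-delta / temp), which lets the search escape local optima while the temperature is still high:

```python
import math
import random

def evaluate(x1, x2, x3):
    return x1 + x2 - x3

def simulated_annealing(x_range, temp=10.0, cooling=0.95, steps=500):
    """Minimize evaluate() by simulated annealing over integer parameters."""
    sol = [random.randint(lo, hi) for lo, hi in x_range]
    best = sol[:]
    for _ in range(steps):
        # Propose a random +/-1 step on one parameter, clamped to its range
        i = random.randrange(len(sol))
        lo, hi = x_range[i]
        new_sol = sol[:]
        new_sol[i] = max(lo, min(hi, new_sol[i] + random.choice([-1, 1])))
        delta = evaluate(*new_sol) - evaluate(*sol)
        # Always accept improvements; accept worse moves with
        # probability exp(-delta / temp)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            sol = new_sol
            if evaluate(*sol) < evaluate(*best):
                best = sol[:]
        temp *= cooling  # cool down: fewer worse moves accepted over time
    return best, evaluate(*best)

print(simulated_annealing([[-2, 5], [2, 6], [-5, 2]]))
```

As the temperature decays, the acceptance rule degenerates into greedy hill climbing, so the early high-temperature phase is what provides the chance to escape local optima.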

 

