Simulated annealing is a probabilistic technique used to approximate the global optimum of an objective function within a large search space. It works by randomly selecting solutions close to the current one and deciding whether to move to the new solution or stay with the current one based on probabilities that change as the "temperature" decreases during the search process. Simulated annealing was inspired by annealing in metallurgy and can be used to find approximate solutions to complex optimization problems like the traveling salesman problem.
Simulated annealing (SA) is a probabilistic technique for approximating the global optimum of a given function. Specifically, it is a metaheuristic to approximate global optimization in a large search space for an optimization problem. It is often used when the search space is discrete (e.g., the traveling salesman problem). For problems where finding an approximate global optimum is more important than finding a precise local optimum in a fixed amount of time, simulated annealing may be preferable to alternatives such as gradient descent.

Simulated annealing can be used to find an approximation of a global minimum for a function of many variables. In 1983, Kirkpatrick, Gelatt Jr., and Vecchi applied this approach to the traveling salesman problem; they also proposed its current name, simulated annealing. Accepting worse solutions is a fundamental property of metaheuristics because it allows a more extensive search for the globally optimal solution.

In general, simulated annealing algorithms work as follows. At each time step, the algorithm randomly selects a solution close to the current one, measures its quality, and decides whether to move to it or to stay with the current solution, based on one of two probabilities chosen according to whether the new solution is better or worse than the current one. During the search, the temperature is progressively decreased from an initial positive value to zero, and this affects the two probabilities: at each step, the probability of moving to a better new solution is either kept at 1 or changed toward a positive value, while the probability of moving to a worse new solution is progressively driven toward zero. The simulation can be performed either by solving kinetic equations for probability density functions or by stochastic sampling. The method is an adaptation of the Metropolis–Hastings algorithm, a Monte Carlo method for generating sample states of a thermodynamic system, published by N. Metropolis et al. in 1953.
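The procedure described above can be sketched in a minimal form. The example below is an illustrative implementation, not a canonical one: the quadratic objective, the geometric cooling schedule, and the uniform neighbor step are all assumptions chosen for simplicity, and the Metropolis criterion exp(-delta / t) supplies the probability of accepting a worse solution.

```python
import math
import random

def simulated_annealing(objective, x0, t0=10.0, cooling=0.95, steps=1000):
    """Minimize `objective` starting from x0, with geometric cooling."""
    current, current_cost = x0, objective(x0)
    best, best_cost = current, current_cost
    t = t0
    for _ in range(steps):
        # Randomly select a solution close to the current one (assumed
        # neighborhood: a uniform step of at most 1 in either direction).
        candidate = current + random.uniform(-1.0, 1.0)
        candidate_cost = objective(candidate)
        delta = candidate_cost - current_cost
        # Always accept a better solution; accept a worse one with
        # probability exp(-delta / t), which shrinks as t decreases.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
        t *= cooling  # progressively decrease the temperature toward zero
    return best, best_cost

random.seed(0)
x, fx = simulated_annealing(lambda x: x * x, x0=8.0)
print(x, fx)
```

As the temperature falls, the acceptance test becomes effectively greedy, so early iterations explore broadly while late iterations refine the best solution found so far.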