Random search

Random search (RS) is a family of numerical optimization methods that do not require the gradient of the function being optimized, and RS can hence be used on functions that are not continuous or differentiable. Such optimization methods are also known as direct-search, derivative-free, or black-box methods.

The name "random search" is attributed to Rastrigin[1] who made an early presentation of RS along with basic mathematical analysis. RS works by iteratively moving to better positions in the search-space, which are sampled from a hypersphere surrounding the current position.

The algorithm described herein is a type of local random search, where every iteration is dependent on the prior iteration's candidate solution. There are alternative random search methods which sample from the entirety of the search space (for example pure random search or uniform global random search), but these are not described in this article.

Algorithm

Let f: ℝⁿ → ℝ be the fitness or cost function which must be minimized. Let x ∈ ℝⁿ designate a position or candidate solution in the search-space. The basic RS algorithm can then be described as:

  1. Initialize x with a random position in the search-space.
  2. Until a termination criterion is met (e.g. number of iterations performed, or adequate fitness reached), repeat the following:
    1. Sample a new position y from the hypersphere of a given radius surrounding the current position x (see, e.g., Marsaglia's technique for sampling a hypersphere).
    2. If f(y) < f(x) then move to the new position by setting x = y
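
A minimal Python sketch of this loop is given below. The function name random_search, the fixed sampling radius, and the iteration-count termination criterion are illustrative assumptions rather than part of any standard formulation. The direction vector is obtained by normalizing a Gaussian sample, which yields points uniformly distributed on the unit hypersphere.

  import numpy as np

  def random_search(f, x0, radius=0.5, max_iters=1000, rng=None):
      """Local random search with a fixed sampling radius.
      Hypothetical signature; radius and max_iters are illustrative."""
      rng = np.random.default_rng() if rng is None else rng
      x = np.asarray(x0, dtype=float)
      fx = f(x)
      for _ in range(max_iters):
          # Step 2.1: sample y uniformly on the hypersphere around x.
          # A normalized Gaussian vector is uniform on the unit sphere.
          d = rng.standard_normal(x.shape)
          y = x + radius * d / np.linalg.norm(d)
          # Step 2.2: move only if the candidate is strictly better.
          fy = f(y)
          if fy < fx:
              x, fx = y, fy
      return x, fx

  # Example: minimize the sphere function from a random start in [-5, 5]^3.
  rng = np.random.default_rng(0)
  x0 = rng.uniform(-5.0, 5.0, size=3)
  best_x, best_f = random_search(lambda v: float(np.dot(v, v)), x0,
                                 radius=0.5, max_iters=5000, rng=rng)
  print(best_x, best_f)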

Variants

A number of RS variants have been introduced in the literature:

  • Fixed Step Size Random Search (FSSRS) is Rastrigin's [1] basic algorithm which samples from a hypersphere of fixed radius.
  • Optimum Step Size Random Search (OSSRS) by Schumer and Steiglitz [2] is primarily a theoretical study on how to optimally adjust the radius of the hypersphere so as to allow for speedy convergence to the optimum. An actual implementation of the OSSRS needs to approximate this optimal radius by repeated sampling and is therefore expensive to execute.
  • Adaptive Step Size Random Search (ASSRS) by Schumer and Steiglitz [2] attempts to heuristically adapt the hypersphere's radius: two new candidate solutions are generated, one with the current nominal step size and one with a larger step size. The larger step size becomes the new nominal step size if and only if it leads to a larger improvement. If for several iterations neither step leads to an improvement, the nominal step size is reduced (see the sketch after this list).
  • Optimized Relative Step Size Random Search (ORSSRS) by Schrack and Choit [3] approximates the optimal step size by a simple exponential decrease. However, the formula for computing the decrease-factor is somewhat complicated.
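
The following sketch illustrates the ASSRS adaptation rule described above. The growth and shrink factors and the failure threshold are illustrative assumptions, not the values analyzed by Schumer and Steiglitz [2]; only the overall rule (compare a nominal and an enlarged step, shrink after a run of failures) follows the description in this article.

  import numpy as np

  def assrs_minimize(f, x0, radius=1.0, grow=1.5, shrink=0.5,
                     fail_limit=30, max_iters=10000, rng=None):
      """Adaptive Step Size Random Search, sketched per the description
      above. grow, shrink and fail_limit are illustrative assumptions."""
      rng = np.random.default_rng() if rng is None else rng

      def sample(center, r):
          # Uniform point on the radius-r hypersphere around center.
          d = rng.standard_normal(center.shape)
          return center + r * d / np.linalg.norm(d)

      x = np.asarray(x0, dtype=float)
      fx = f(x)
      fails = 0
      for _ in range(max_iters):
          y1 = sample(x, radius)           # candidate at the nominal step size
          y2 = sample(x, grow * radius)    # candidate at the larger step size
          f1, f2 = f(y1), f(y2)
          if f2 < f1 and f2 < fx:
              # Larger step gave the larger improvement: adopt it as nominal.
              x, fx, radius, fails = y2, f2, grow * radius, 0
          elif f1 < fx:
              x, fx, fails = y1, f1, 0
          else:
              fails += 1
              if fails >= fail_limit:      # persistent failure: shrink the step
                  radius *= shrink
                  fails = 0
      return x, fx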

References

  1. Rastrigin, L.A. (1963). "The convergence of the random search method in the extremal control of a many parameter system". Automation and Remote Control. 24 (10): 1337–1342.
  2. Schumer, M.A.; Steiglitz, K. (1968). "Adaptive step size random search". IEEE Transactions on Automatic Control. 13 (3): 270–276. CiteSeerX 10.1.1.118.9779. doi:10.1109/tac.1968.1098903.
  3. Schrack, G.; Choit, M. (1976). "Optimized relative step size random searches". Mathematical Programming. 10 (1): 230–244. doi:10.1007/bf01580669.