A module of continuous test functions for mathematical optimisation, implemented in one-, two- and multi-dimensional forms and intended for benchmarking the performance of optimisation routines. Many are taken from http://www.sfu.ca/~ssurjano/optimization.html.
Each test function is implemented as a class with up to three methods that calculate the function's cost, gradient and Hessian. Each class also carries additional information, such as the position and value of the global minimum, the function's domain, and a LaTeX description of the cost function. The implementation of the 2D Rosenbrock function is shown below.
class Rosenbrock(Function2D):
    """ Rosenbrock Function. """

    def __init__(self):
        """ Constructor. """
        # Information
        self.min = np.array([1.0, 1.0])
        self.value = 0.0
        self.domain = np.array([[-np.inf, np.inf], [-np.inf, np.inf]])
        self.n = 2
        self.smooth = True
        self.info = [True, True, True]
        # Description
        self.latex_name = "Rosenbrock Function"
        self.latex_type = "Valley Shaped"
        self.latex_cost = r"\[ f(\boldsymbol{x}) = \sum_{i=0}^{d-2} \left[ 100 \left(x_{i+1} - x_{i}^{2}\right)^{2} + \left(x_{i} - 1\right)^{2}\right] \]"
        self.latex_desc = "The Rosenbrock function, also referred to as the Valley or Banana function, is a " \
                          "popular test problem for gradient-based optimization algorithms. It is shown in the " \
                          "plot above in its two-dimensional form. The function is unimodal, and the global " \
                          "minimum lies in a narrow, parabolic valley. However, even though this valley is easy " \
                          "to find, convergence to the minimum is difficult."

    def cost(self, x):
        """ Cost function. """
        # Calculate cost; x[0] and x[1] may be scalars or arrays of grid points
        c = 100.0*(x[1] - x[0]**2.0)**2.0 + (x[0] - 1.0)**2.0
        # Return cost
        return c

    def grad(self, x):
        """ Gradient function. """
        # Gradient
        g = np.zeros(x.shape)
        # Calculate gradient components
        g[0] = -400.0*x[0]*(x[1] - x[0]**2.0) + 2.0*(x[0] - 1.0)
        g[1] = 200.0*(x[1] - x[0]**2.0)
        # Return gradient
        return g

    def hess(self, x):
        """ Hessian function. """
        # Hessian
        h = np.zeros((2, 2) + x.shape[1:])
        # Calculate Hessian components
        h[0, 0] = -400.0*x[1] + 1200.0*x[0]**2.0 + 2.0
        h[0, 1] = -400.0*x[0]
        h[1, 0] = h[0, 1]
        h[1, 1] = 200.0
        # Return Hessian
        return h
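A class like this is easy to sanity-check in isolation. The sketch below re-uses the cost and gradient expressions from the listing (with a trivial stand-in for the Function2D base class, which is an assumption, since its definition is not shown here) and compares the analytic gradient against a central finite difference:

```python
import numpy as np

class Function2D:
    """Minimal stand-in for the ctf base class (assumed interface)."""
    pass

class Rosenbrock(Function2D):
    """Rosenbrock cost and gradient, copied from the listing above."""
    def cost(self, x):
        return 100.0*(x[1] - x[0]**2.0)**2.0 + (x[0] - 1.0)**2.0

    def grad(self, x):
        g = np.zeros(x.shape)
        g[0] = -400.0*x[0]*(x[1] - x[0]**2.0) + 2.0*(x[0] - 1.0)
        g[1] = 200.0*(x[1] - x[0]**2.0)
        return g

func = Rosenbrock()
x = np.array([-1.2, 1.0])   # a classic Rosenbrock starting point
eps = 1e-6

# Central finite difference, one coordinate at a time
fd = np.zeros(2)
for i in range(2):
    step = np.zeros(2)
    step[i] = eps
    fd[i] = (func.cost(x + step) - func.cost(x - step)) / (2.0*eps)

# The analytic and numerical gradients should agree closely
print(np.max(np.abs(fd - func.grad(x))))
```

The same check works for the Hessian by differencing grad; it is a cheap way to catch sign or index errors when adding a new test function.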
The following example shows how the module can be used.
>>> import numpy as np
>>> import matplotlib.pyplot as plt
>>> from ctf.functions2d import Beale
>>> func = Beale()
>>> func.latex_desc
The Beale function is multimodal, with sharp peaks at the corners of the input domain.
>>> func.domain
[[-4.5 4.5]
[-4.5 4.5]]
>>> func.cost(np.array([2.0, 0.0]))
0.703125
>>> func.grad(np.array([2.0, 0.0]))
[-0.75 -2. ]
>>> func.hess(np.array([2.0, 0.0]))
[[ 6. -5.]
[ -5. 10.]]
>>> func.plot_cost()
>>> plt.show()
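Because every function exposes grad and hess, second-order methods can be driven directly from a class. The sketch below runs a plain Newton iteration on the Rosenbrock function; the gradient and Hessian formulas are copied from the listing earlier so the example is self-contained, and from this particular starting point the undamped iteration happens to converge (in general a line search or trust region would be needed):

```python
import numpy as np

def rosen_grad(x):
    """Rosenbrock gradient, as in the grad() method above."""
    return np.array([-400.0*x[0]*(x[1] - x[0]**2.0) + 2.0*(x[0] - 1.0),
                     200.0*(x[1] - x[0]**2.0)])

def rosen_hess(x):
    """Rosenbrock Hessian, as in the hess() method above."""
    return np.array([[-400.0*x[1] + 1200.0*x[0]**2.0 + 2.0, -400.0*x[0]],
                     [-400.0*x[0], 200.0]])

x = np.array([0.0, 0.0])
for _ in range(20):
    g = rosen_grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    # Newton step: solve H d = -g and move to x + d
    x = x + np.linalg.solve(rosen_hess(x), -g)

print(x)   # converges to the global minimum at (1, 1)
```

Substituting any other class's grad and hess into the same loop gives a quick convergence comparison across the test set.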
Requires Python 3 and depends on NumPy and Matplotlib.
2D Functions

Many Local Minima
- Ackley Function
- Bukin Function No. 6
- Cross-in-Tray Function
- Drop-Wave Function
- Eggholder Function
- Griewank Function
- Holder Table Function
- Levy Function No. 13
- Rastrigin Function
- Schaffer Function No. 2
- Schaffer Function No. 4
- Schwefel Function
- Shubert Function
Bowl Shaped
- Bohachevsky No. 1 Function
- Bohachevsky No. 2 Function
- Bohachevsky No. 3 Function
- Perm Function
- Rotated Hyper-Ellipsoid Function
- Sphere Function
- Sum of Different Powers Function
- Sum Squares Function
- Trid Function
Plate-Shaped
Valley-Shaped
Steep Ridges/Drops
Other
nD Functions

Many Local Minima
- Ackley Function
- Griewank Function
- Rastrigin Function
- Schwefel Function
Bowl Shaped
- Perm Function
- Rotated Hyper-Ellipsoid Function
- Sphere Function
- Sum of Different Powers Function
- Sum Squares Function
- Trid Function
Plate-Shaped
- Power Sum Function
- Zakharov Function
Valley-Shaped
- Dixon-Price Function
- Rosenbrock Function
Steep Ridges/Drops
- Michalewicz Function
Other
- Styblinski-Tang Function