
Source: http://accord-framework.net/docs/html/T_Accord_Math_Optimization_Cobyla.htm

Cobyla Class

Constrained optimization by linear approximation.

Namespace: Accord.Math.Optimization
Assembly: Accord.Math (in Accord.Math.dll), Version 3.8.0

Syntax
public class Cobyla : BaseOptimizationMethod, IOptimizationMethod, 
	IOptimizationMethod<double[], double>, IOptimizationMethod<CobylaStatus>, 
	IOptimizationMethod<double[], double, CobylaStatus>
Public Class Cobyla
	Inherits BaseOptimizationMethod
	Implements IOptimizationMethod, IOptimizationMethod(Of Double(), Double), 
	IOptimizationMethod(Of CobylaStatus), IOptimizationMethod(Of Double(), Double, CobylaStatus)

The Cobyla type exposes the following members.

Methods

Equals
    Determines whether the specified object is equal to the current object. (Inherited from Object.)

Finalize
    Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection. (Inherited from Object.)

GetHashCode
    Serves as the default hash function. (Inherited from Object.)

GetType
    Gets the Type of the current instance. (Inherited from Object.)

Maximize()
    Finds the maximum value of a function. The solution vector will be made available at the Solution property. (Inherited from BaseOptimizationMethod.)

Maximize(Double)
    Finds the maximum value of a function. The solution vector will be made available at the Solution property. (Inherited from BaseOptimizationMethod.)

MemberwiseClone
    Creates a shallow copy of the current Object. (Inherited from Object.)

Minimize()
    Finds the minimum value of a function. The solution vector will be made available at the Solution property. (Inherited from BaseOptimizationMethod.)

Minimize(Double)
    Finds the minimum value of a function. The solution vector will be made available at the Solution property. (Inherited from BaseOptimizationMethod.)

OnNumberOfVariablesChanged
    Called when the NumberOfVariables property has changed. (Inherited from BaseOptimizationMethod.)

Optimize
    Implements the actual optimization algorithm. This method should try to minimize the objective function. (Overrides BaseOptimizationMethod.Optimize.)

ToString
    Returns a string that represents the current object. (Inherited from Object.)

Extension Methods

HasMethod
    Checks whether an object implements a method with the given name. (Defined by ExtensionMethods.)

IsEqual
    Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices. (Defined by Matrix.)

To(Type)
    Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types at runtime. (Defined by ExtensionMethods.)

To<T>
    Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types at runtime. (Defined by ExtensionMethods.)

Remarks

Constrained optimization by linear approximation (COBYLA) is a numerical optimization method for constrained problems where the derivative of the objective function is not known, invented by Michael J. D. Powell.

COBYLA2 is an implementation of Powell's nonlinear, derivative-free constrained optimization method, which takes a linear approximation approach. It is a sequential trust-region algorithm that employs linear approximations to the objective and constraint functions; the approximations are formed by linear interpolation at n + 1 points in the space of the variables, and the algorithm tries to maintain a regular-shaped simplex over the iterations.

This algorithm is able to solve non-smooth NLP problems with a moderate number of variables (about 100), with inequality constraints only.


Examples

Let's say we would like to optimize a function whose gradient we do not know or is too difficult to compute. All we have to do is specify the function, pass it to Cobyla, and call its Minimize() method:

// Objective: f(x) = 10 * (x0 + 1)^2 + x1^2, minimized at (-1, 0)
Func<double[], double> function = x => 10 * Math.Pow(x[0] + 1, 2) + Math.Pow(x[1], 2);

// Create a COBYLA solver for a problem with 2 variables
Cobyla cobyla = new Cobyla(2, function);

bool success = cobyla.Minimize();

double minimum = cobyla.Value;       // the minimum found (should be near 0)
double[] solution = cobyla.Solution; // the solution point (should be near (-1, 0))

Cobyla can be used even when we have constraints in our optimization problem. The following example can be found in Fletcher's book Practical Methods of Optimization, under the equation number (9.1.15).

// Objective: minimize f(x) = -x0 - x1
var f = new NonlinearObjectiveFunction(2, x => -x[0] - x[1]);

// Subject to x1 - x0^2 >= 0 and 1 - x0^2 - x1^2 >= 0
var constraints = new[]
{
    new NonlinearConstraint(2, x =>             x[1] - x[0] * x[0] >= 0),
    new NonlinearConstraint(2, x =>  1 - x[0] * x[0] - x[1] * x[1] >= 0),
};

// Create the solver from the objective function and its constraints
var cobyla = new Cobyla(f, constraints);

bool success = cobyla.Minimize();
double minimum = cobyla.Value;       // objective value at the solution found
double[] solution = cobyla.Solution; // the solution point itself
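Because Cobyla inherits Maximize() from BaseOptimizationMethod, the same solver can also search for a maximum. A minimal sketch, using an illustrative objective that is not part of the original page:

```csharp
using System;
using Accord.Math.Optimization;

// Illustrative objective: f(x) = -(x0 - 3)^2 - (x1 + 1)^2,
// a concave quadratic whose maximum value 0 is attained at (3, -1).
Func<double[], double> function = x =>
    -Math.Pow(x[0] - 3, 2) - Math.Pow(x[1] + 1, 2);

Cobyla cobyla = new Cobyla(2, function);

// Search for a maximum instead of a minimum
bool success = cobyla.Maximize();

double maximum = cobyla.Value;       // expected to approach 0
double[] solution = cobyla.Solution; // expected to approach (3, -1)
```

The unconstrained constructor used here is the same Cobyla(int, Func<double[], double>) overload as in the first example; constrained problems can use Maximize() the same way.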
