Showing content from http://accord-framework.net/docs/html/T_Accord_Math_Optimization_GaussNewton.htm below:

GaussNewton Class

Gauss-Newton algorithm for solving Least-Squares problems.
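In outline (this summary is not part of the original reference, but follows the standard textbook formulation): Gauss-Newton minimizes a sum of squared residuals by repeatedly solving a linearized least-squares problem. Writing the residuals as $r_i(\beta) = y_i - f(x_i; \beta)$ with Jacobian $J_{ij} = \partial r_i / \partial \beta_j$, each iteration applies the update

```latex
\beta^{(k+1)} = \beta^{(k)} - \left(J^\top J\right)^{-1} J^\top r\!\left(\beta^{(k)}\right)
```

Unlike Levenberg-Marquardt, there is no damping term added to $J^\top J$, which is why this method can fail on poorly conditioned problems and is best used as a baseline.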

Namespace: Accord.Math.Optimization
Assembly: Accord.Math (in Accord.Math.dll), Version: 3.8.0

Syntax

C#:
public class GaussNewton : BaseLeastSquaresMethod,
	ILeastSquaresMethod, IConvergenceLearning

VB:
Public Class GaussNewton
	Inherits BaseLeastSquaresMethod
	Implements ILeastSquaresMethod, IConvergenceLearning

The GaussNewton type exposes the following members.

Methods

ComputeError - Compute model error for a given data set. (Inherited from BaseLeastSquaresMethod.)
Equals - Determines whether the specified object is equal to the current object. (Inherited from Object.)
Finalize - Allows an object to try to free resources and perform other cleanup operations before it is reclaimed by garbage collection. (Inherited from Object.)
GetHashCode - Serves as the default hash function. (Inherited from Object.)
GetType - Gets the Type of the current instance. (Inherited from Object.)
Initialize - This method should be implemented by child classes to initialize their fields once the NumberOfParameters is known. (Overrides BaseLeastSquaresMethod.Initialize.)
MemberwiseClone - Creates a shallow copy of the current Object. (Inherited from Object.)
Minimize - Attempts to find the best values for the parameter vector minimizing the discrepancy between the generated outputs and the expected outputs for a given set of input data.
ToString - Returns a string that represents the current object. (Inherited from Object.)

Extension Methods

HasMethod - Checks whether an object implements a method with the given name. (Defined by ExtensionMethods.)
IsEqual - Compares two objects for equality, performing an elementwise comparison if the elements are vectors or matrices. (Defined by Matrix.)
To(Type) - Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)
To<T> - Overloaded. Converts an object into another type, irrespective of whether the conversion can be done at compile time or not. This can be used to convert generic types to numeric types during runtime. (Defined by ExtensionMethods.)

Remarks

This class isn't suitable for most real-world problems. Instead, it is intended to be used as a baseline to help debug and check other optimization methods, such as LevenbergMarquardt.

Examples

While it is possible to use the GaussNewton class as a standalone method for solving least squares problems, this class is intended to be used as a strategy for NonlinearLeastSquares, as shown in the example below:

C#:
// The data to fit: the first row holds the inputs (x),
// the second row holds the measured outputs (y)
double[,] data =
{
    { 0.03, 0.1947, 0.425, 0.626, 1.253, 2.500, 3.740 },
    { 0.05, 0.127, 0.094, 0.2122, 0.2729, 0.2665, 0.3317 }
};

// Extract inputs and outputs from the data table
double[][] inputs = data.GetRow(0).ToJagged();
double[] outputs = data.GetRow(1);

// Create a nonlinear least-squares learner using Gauss-Newton as its solver
var nls = new NonlinearLeastSquares()
{
    // Initial guess for the model parameters
    StartValues = new[] { 0.9, 0.2 },

    // The model to fit: f(x) = (w0 * x) / (w1 + x)
    Function = (w, x) => (w[0] * x[0]) / (w[1] + x[0]),

    // The gradient of the model with respect to its parameters w0 and w1
    Gradient = (w, x, r) =>
    {
        r[0] = -((-x[0]) / (w[1] + x[0]));
        r[1] = -((w[0] * x[0]) / Math.Pow(w[1] + x[0], 2));
    },

    Algorithm = new GaussNewton()
    {
        MaxIterations = 0, // 0 = no iteration limit (stop on tolerance)
        Tolerance = 1e-5
    }
};

// Learn the regression from the data
var regression = nls.Learn(inputs, outputs);

// Compute the model's predictions for the training inputs
double[] predict = regression.Transform(inputs);
VB:
' The data to fit: the first row holds the inputs (x),
' the second row holds the measured outputs (y)
Dim data As Double(,) =
{
    {0.03, 0.1947, 0.425, 0.626, 1.253, 2.5, 3.74},
    {0.05, 0.127, 0.094, 0.2122, 0.2729, 0.2665, 0.3317}
}

' Extract inputs and outputs from the data table
Dim inputs As Double()() = data.GetRow(0).ToJagged()
Dim outputs As Double() = data.GetRow(1)

' Create a nonlinear least-squares learner
Dim nls As NonlinearLeastSquares = New NonlinearLeastSquares
With nls
    ' Initial guess for the model parameters
    .StartValues = {0.9, 0.2}

    ' The model to fit: f(x) = (w0 * x) / (w1 + x)
    .Function = Function(w, x) w(0) * x(0) / (w(1) + x(0))

    ' The gradient of the model with respect to w0 and w1
    .Gradient = Sub(w, x, r)
                    r(0) = -((-x(0)) / (w(1) + x(0)))
                    r(1) = -((w(0) * x(0)) / System.Math.Pow(w(1) + x(0), 2))
                End Sub
End With

' Use Gauss-Newton as the solver
Dim algorithm As GaussNewton = New GaussNewton
With algorithm
    .MaxIterations = 0 ' 0 = no iteration limit (stop on tolerance)
    .Tolerance = 0.00001
End With

nls.Algorithm = algorithm

' Learn the regression from the data
Dim regression As NonlinearRegression = nls.Learn(inputs, outputs)

' Compute the model's predictions for the training inputs
Dim predict As Double() = regression.Transform(inputs)

However, as mentioned above, it is also possible to use GaussNewton as a standalone class, as shown in the example below:

// The data: inputs as a column vector, outputs as the measured values
double[][] inputs = Jagged.ColumnVector(new[] { 0.03, 0.1947, 0.425, 0.626, 1.253, 2.500, 3.740 });
double[] outputs = new[] { 0.05, 0.127, 0.094, 0.2122, 0.2729, 0.2665, 0.3317 };

// The model to fit: f(x) = (p0 * x) / (p1 + x)
LeastSquaresFunction function = (double[] parameters, double[] input) =>
{
    return (parameters[0] * input[0]) / (parameters[1] + input[0]);
};

// The gradient of the model with respect to its parameters
LeastSquaresGradientFunction gradient = (double[] parameters, double[] input, double[] result) =>
{
    result[0] = -((-input[0]) / (parameters[1] + input[0]));
    result[1] = -((parameters[0] * input[0]) / Math.Pow(parameters[1] + input[0], 2));
};

// Create the Gauss-Newton solver for a model with two parameters
var gn = new GaussNewton(parameters: 2)
{
    Function = function,
    Gradient = gradient,
    Solution = new[] { 0.9, 0.2 } // initial parameter guess
};

// Find the parameter values that minimize the squared error
gn.Minimize(inputs, outputs);

// Retrieve the estimated parameters
double b1 = gn.Solution[0];
double b2 = gn.Solution[1];
