
Least Squares Fitting

Most often you will have a set of sample data points, say (x_i, y_i), and you will want to know how the observations y_i relate to the independent variable x_i, for i running from 1 to n.

Through least squares fitting, the data analyst determines the best fit to the sample data set by optimising over the model parameters rather than over the independent variables.

The optimisation over the model parameters is carried out by minimising the sum of squared residuals:

S = \sum_{i=1}^{n} [y_i - f(x_i)]^2        (Eq. 1)

where f(x_i) is the model function used to approximate the data set (x_i, y_i), for example a straight line:

f(x_i) = a + b x_i
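As a quick illustration, here is a minimal Python sketch (assuming NumPy and made-up sample points, which are hypothetical and should be replaced with your own data) that evaluates the sum of squared residuals in Eq. 1 for the straight-line model and computes the closed-form least-squares estimates of a and b.

```python
import numpy as np

# Hypothetical sample data (x_i, y_i); replace with your own observations.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

def sum_of_squares(a, b, x, y):
    """S = sum over i of [y_i - (a + b*x_i)]^2, i.e. Eq. 1 for a straight line."""
    residuals = y - (a + b * x)
    return np.sum(residuals ** 2)

# Closed-form least-squares estimates for the straight-line model f(x) = a + b*x.
b_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a_hat = y.mean() - b_hat * x.mean()

print("a =", a_hat, "b =", b_hat, "S =", sum_of_squares(a_hat, b_hat, x, y))
```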

Least squares problems fall into two categories:

  1. Linear

This is a model that can be written as a linear expression of the parameters, i.e.

f(x) = \sum_{j=1}^{m} a_j \phi_j(x)

where the a_j, for j = 1, 2, 3, …, m, are the parameters and the \phi_j(x) are known functions of x (for a straight line, \phi_1(x) = 1 and \phi_2(x) = x).

The idea is to compute the parameters such that Eq. 1 is minimised. To do this, take the partial derivative of the sum S with respect to each parameter and set these derivatives to zero, i.e.


\partial S / \partial a_j = 0,        for j = 1, 2, 3, …, m

You will end up with a system of m equations in the m unknown parameters (the normal equations). Solve them simultaneously to obtain the least-squares estimates of the parameters.
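The sketch below (again assuming NumPy and hypothetical data) sets up the design matrix for the straight-line model and solves the normal equations (A^T A) a = A^T y that result from setting the partial derivatives to zero; np.linalg.lstsq is shown as an equivalent, numerically more robust alternative.

```python
import numpy as np

# Hypothetical sample data; replace with your own observations.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Design matrix A with one column per basis function
# (here 1 and x, i.e. the straight-line model f(x) = a_1 + a_2*x).
A = np.column_stack([np.ones_like(x), x])

# Setting the partial derivatives of S to zero gives the normal equations
# (A^T A) a = A^T y, a system of m equations in the m parameters.
params = np.linalg.solve(A.T @ A, A.T @ y)

# Equivalent route that avoids forming A^T A explicitly.
params_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)

print("normal equations:", params)
print("np.linalg.lstsq :", params_lstsq)
```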

  2. Non-Linear

This is a model that cannot be written as a linear expression of the parameters, e.g.

y = a_1 + a_2 e^{a_3 x}

For such a model you start with approximate values of the parameters and then refine them iteratively, updating the estimates until the sum of squared residuals stops decreasing; standard routines such as Gauss-Newton or Levenberg-Marquardt perform this iteration for you.
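As a sketch of this iterative approach, the example below uses scipy.optimize.curve_fit, which refines an initial guess by minimising the sum of squared residuals, on hypothetical data for the model y = a_1 + a_2 e^{a_3 x}; the data values and starting guesses are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical sample data; replace with your own observations.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = np.array([2.1, 2.6, 3.5, 4.4, 5.6, 7.4, 9.2])

def model(x, a1, a2, a3):
    """Non-linear model y = a1 + a2 * exp(a3 * x); non-linear in the parameter a3."""
    return a1 + a2 * np.exp(a3 * x)

# Start from approximate parameter values; curve_fit iterates on them,
# reducing the sum of squared residuals at each step.
initial_guess = [1.0, 1.0, 0.5]
popt, pcov = curve_fit(model, x, y, p0=initial_guess)

print("fitted parameters a1, a2, a3:", popt)
print("sum of squared residuals:", np.sum((y - model(x, *popt)) ** 2))
```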