
Regression Equations

Regression equations, also known as estimating equations, are algebraic expressions of the regression lines. Since there are two regression lines, there are two regression equations: the regression equation of X on Y describes the variations in the values of X for given changes in Y, and the regression equation of Y on X describes the variations in the values of Y for given changes in X.

Regression Equation of Y on X

The regression equation of Y on X is expressed as follows:
            Yc = a + bX
In this equation a and b are constants (fixed numerical values) which determine the position of the line completely. These constants are called the parameters of the line. If the value of either or both of them is changed, another line is determined. The parameter ‘a’ determines the intercept, i.e., the value of Y (the dependent variable) when X (the independent variable) takes the value zero. The parameter ‘b’ determines the slope of the line, i.e., the change in Y per unit change in X. The symbol Yc stands for the value of Y computed from the relationship for a given X.

If the values of the constants ‘a’ and ‘b’ are obtained, the line is completely determined. But the question is how to obtain these values. The answer is provided by the method of least squares, which states that the line should be drawn through the plotted points in such a manner that the sum of the squares of the deviations of the actual Y values from the computed Y values is the least; in other words, in order to obtain the line which fits the points best, ∑(Y – Yc)² should be a minimum. Such a line is known as the line of ‘best fit.’

A straight line fitted by least squares has the following characteristics:

1. It gives the best fit to the data in the sense that it makes the sum of the squared deviations from the line, ∑(Y – Yc)², smaller than it would be from any other straight line. This property accounts for the name ‘least squares.’

2. The deviations above the line equal those below the line, on the average. This means that the total of the positive and negative deviations is zero, or ∑(Y – Yc) = 0.

3. The straight line goes through the overall mean of the data, the point (X̄, Ȳ).

4. When the data represent a sample from a larger population, the least squares line is a ‘best’ estimate of the population regression line.
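The first two properties above can be checked numerically. The following sketch, using hypothetical data (the X and Y values are illustrative, not from the text), fits the least squares line and verifies that the residuals sum to zero and that the line passes through the point of means:

```python
# Hypothetical paired observations (illustrative values only).
X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 6]
N = len(X)

# Least squares slope and intercept, derived from the normal equations.
b = (N * sum(x * y for x, y in zip(X, Y)) - sum(X) * sum(Y)) / \
    (N * sum(x * x for x in X) - sum(X) ** 2)
a = (sum(Y) - b * sum(X)) / N

# Property 2: the positive and negative deviations cancel, so
# the residual sum is zero (up to floating-point rounding).
residual_sum = sum(y - (a + b * x) for x, y in zip(X, Y))

# Property 3: the fitted line passes through the overall means.
mean_x, mean_y = sum(X) / N, sum(Y) / N
gap_at_means = (a + b * mean_x) - mean_y

print(residual_sum, gap_at_means)
```

Both printed values come out as zero apart from floating-point rounding, confirming the two properties for this data set.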

With a little algebra and differential calculus it can be shown that the following two equations, if solved simultaneously, will yield values of the parameters a and b such that the least squares requirement is fulfilled.

  ∑Y = Na + b∑X
  ∑XY = a∑X + b∑X²

These equations are usually called the normal equations. In these equations ∑X, ∑Y, ∑XY and ∑X² denote totals which are computed from the observed pairs of values of the two variables X and Y to which the least squares estimating line is to be fitted, and N is the number of observed pairs of values.
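As a minimal sketch of the procedure, the normal equations can be solved simultaneously for a and b once the totals are computed. The data below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Hypothetical paired observations (illustrative values only).
X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 6]
N = len(X)

# Totals required by the normal equations.
sum_x = sum(X)                                  # ∑X
sum_y = sum(Y)                                  # ∑Y
sum_xy = sum(x * y for x, y in zip(X, Y))       # ∑XY
sum_x2 = sum(x * x for x in X)                  # ∑X²

# Solving  ∑Y  = Na + b∑X
#          ∑XY = a∑X + b∑X²   simultaneously (by elimination):
b = (N * sum_xy - sum_x * sum_y) / (N * sum_x2 - sum_x ** 2)
a = (sum_y - b * sum_x) / N

print(a, b)
```

For these values the totals are ∑X = 15, ∑Y = 21, ∑XY = 71 and ∑X² = 55, giving a = 1.8 and b = 0.8, so the estimating line is Yc = 1.8 + 0.8X.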

Regression Equation of X on Y

The regression equation of X on Y is expressed as follows:

    Xc = a + bY

To determine the values of a and b the following two normal equations are to be solved simultaneously.

    ∑X = Na + b∑Y
    ∑XY = a∑Y + b∑Y²
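The arithmetic mirrors the Y-on-X case with the roles of the variables exchanged. A sketch, again with hypothetical data:

```python
# Hypothetical paired observations (illustrative values only).
X = [1, 2, 3, 4, 5]
Y = [2, 4, 5, 4, 6]
N = len(X)

# Totals required by the normal equations for X on Y.
sum_x = sum(X)                                  # ∑X
sum_y = sum(Y)                                  # ∑Y
sum_xy = sum(x * y for x, y in zip(X, Y))       # ∑XY
sum_y2 = sum(y * y for y in Y)                  # ∑Y²

# Solving  ∑X  = Na + b∑Y
#          ∑XY = a∑Y + b∑Y²   simultaneously (by elimination):
b = (N * sum_xy - sum_y * sum_x) / (N * sum_y2 - sum_y ** 2)
a = (sum_x - b * sum_y) / N

# Xc = a + b*Y then estimates X for a given Y.
print(a, b)
```

Note that the slope of X on Y (here 10/11 ≈ 0.91) generally differs from the slope of Y on X for the same data, which is why the two regression lines are distinct.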
