Properties of a Good Estimator
A distinction is made between an estimate and an estimator. The numerical value of the sample mean is said to be an estimate of the population mean. The statistical measure used, that is, the method of estimation, is referred to as an estimator. A good estimator, as common sense dictates, is one that is close to the parameter being estimated. Its quality is to be evaluated in terms of the following properties:
Unbiasedness: An estimator is said to be unbiased if its expected value is identical with the population parameter being estimated. That is, if θ̂ is an unbiased estimator of θ, then we must have E(θ̂) = θ. Many estimators are "asymptotically unbiased" in the sense that their bias reduces to a practically insignificant value (zero) when n becomes sufficiently large. The estimator S² (the sample variance computed with divisor n) is an example.
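As a quick illustration of asymptotic unbiasedness, the sketch below (plain Python; the simulated standard normal data with σ² = 1, the sample sizes, and the trial counts are all illustrative assumptions) averages the divisor-n sample variance over many samples to approximate its expectation, which approaches the true variance as n grows:

```python
import random

random.seed(0)

def biased_var(xs):
    # Sample variance with divisor n; its expectation is (n-1)/n * sigma^2,
    # so it is biased for small n but asymptotically unbiased.
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def mean_of_estimates(n, trials=20000):
    # Average the estimator over many samples to approximate E(S^2)
    total = 0.0
    for _ in range(trials):
        xs = [random.gauss(0.0, 1.0) for _ in range(n)]  # true variance is 1
        total += biased_var(xs)
    return total / trials

# Expectation is (n-1)/n of the true variance; the bias shrinks as n grows
for n in (2, 5, 50):
    print(n, round(mean_of_estimates(n), 3))
```

The printed averages sit near 0.5, 0.8, and 0.98 respectively, matching the (n−1)/n pattern.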
It should be noted that bias in estimation is not necessarily undesirable. It may turn out to be an asset in some situations.
Consistency: If an estimator, say θ̂, approaches the parameter θ more and more closely as the sample size n increases, θ̂ is said to be a consistent estimator of θ. Stated somewhat more rigorously, the estimator θ̂ is said to be a consistent estimator of θ if, as n approaches infinity, the probability approaches 1 that θ̂ will differ from the parameter θ by no more than an arbitrarily small constant.
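Consistency can be illustrated by simulation. The sketch below (plain Python; the tolerance ε, the sample sizes, and the trial count are arbitrary illustrative choices, with standard normal data and µ = 0) estimates the probability that the sample mean differs from µ by more than ε, which shrinks toward zero as n increases:

```python
import random

random.seed(1)

def prob_outside(n, eps=0.2, trials=5000, mu=0.0):
    # Estimate P(|sample mean - mu| > eps) by repeated sampling
    hits = 0
    for _ in range(trials):
        xbar = sum(random.gauss(mu, 1.0) for _ in range(n)) / n
        if abs(xbar - mu) > eps:
            hits += 1
    return hits / trials

# The probability of missing mu by more than eps falls toward 0 as n grows
for n in (10, 100, 1000):
    print(n, prob_outside(n))
```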
The sample mean is an unbiased estimator of µ no matter what form the population distribution assumes, while the sample median is an unbiased estimator of µ only if the population distribution is symmetrical. The sample mean is better than the sample median as an estimator of µ in terms of both unbiasedness and consistency.
Efficiency: The concept of efficiency refers to the sampling variability of an estimator. If two competing estimators are both unbiased, the one with the smaller variance (for a given sample size) is said to be relatively more efficient. Stated in somewhat different language, an estimator θ̂₁ is said to be more efficient than another estimator θ̂₂ for θ if the variance of the first is less than the variance of the second. The smaller the variance of the estimator, the more concentrated the distribution of the estimator is around the parameter being estimated and, therefore, the better the estimator.
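The relative efficiency of the sample mean over the sample median can be checked by simulation. In the sketch below (plain Python; the sample size and trial count are illustrative assumptions, with standard normal data and µ = 0), both estimators are unbiased for µ, but the median's sampling variance is larger, approaching π/2 times that of the mean for large normal samples:

```python
import random
import statistics

random.seed(2)

def estimator_variance(estimator, n=25, trials=20000):
    # Sampling variance of an estimator of mu, approximated by simulation
    vals = [estimator([random.gauss(0.0, 1.0) for _ in range(n)])
            for _ in range(trials)]
    return statistics.pvariance(vals)

v_mean = estimator_variance(statistics.mean)      # close to 1/n = 0.04
v_median = estimator_variance(statistics.median)  # larger for normal data
print(v_mean, v_median, v_median / v_mean)
```

The printed ratio exceeds 1, showing the mean is the relatively more efficient of the two for normal data.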
Sufficiency: An estimator is said to be sufficient if it conveys as much information as possible about the parameter that is contained in the sample. The significance of sufficiency lies in the fact that, if a sufficient estimator exists, it is absolutely unnecessary to consider any other estimator; a sufficient estimator ensures that all the information a sample can furnish with respect to the estimation of a parameter is utilized.
Many methods have been devised for constructing estimators that satisfy these properties. Two important methods are the method of least squares and the method of maximum likelihood.
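As a small illustration of maximum likelihood, the sketch below (plain Python, with a hypothetical data set of ten coin flips) maximizes the Bernoulli log-likelihood over a grid of candidate values for p; the maximizer agrees with the closed-form MLE, the sample proportion:

```python
import math

# Hypothetical data: ten coin flips, seven heads
data = [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]

def log_likelihood(p, xs):
    # Bernoulli log-likelihood: log p for each success, log(1-p) for each failure
    return sum(math.log(p) if x else math.log(1.0 - p) for x in xs)

# Maximize over a fine grid of candidate values strictly between 0 and 1
candidates = [i / 1000 for i in range(1, 1000)]
p_hat = max(candidates, key=lambda p: log_likelihood(p, data))
print(p_hat)  # the closed-form MLE is the sample proportion, 7/10 = 0.7
```

The grid search is only for illustration; for the Bernoulli model the maximum can be found analytically by setting the derivative of the log-likelihood to zero.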