Consistent estimator
In statistics, a consistent estimator or asymptotically consistent estimator is an estimator—a rule for computing estimates of a parameter θ0—having the property that as the number of data points used increases indefinitely, the resulting sequence of estimates converges in probability to θ0. This means that the distributions of the estimates become more and more concentrated near the true value of the parameter being estimated, so that the probability of the estimator being arbitrarily close to θ0 converges to one.
In practice one constructs an estimator as a function of an available sample of size n, and then imagines being able to keep collecting data and expanding the sample ad infinitum. In this way one would obtain a sequence of estimates indexed by n, and consistency is a property of what occurs as the sample size “grows to infinity”. If the sequence of estimates can be mathematically shown to converge in probability to the true value θ0, it is called a consistent estimator; otherwise the estimator is said to be inconsistent.
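The idea above can be illustrated with a small simulation. The sketch below (an illustrative assumption, not taken from this article) uses the sample mean of i.i.d. uniform(0, 1) draws as an estimator of the true mean θ0 = 0.5, and estimates the probability that the estimate misses θ0 by more than a fixed ε for growing sample sizes n. Consistency predicts this probability shrinks toward zero.

```python
import random

# Sketch: the sample mean as a consistent estimator of the population mean.
# Assumed setup (hypothetical example): i.i.d. uniform(0, 1) data, true mean 0.5.
random.seed(0)
theta0 = 0.5
eps = 0.05  # fixed tolerance around the true value

def sample_mean(n):
    """Estimate theta0 from a sample of size n."""
    return sum(random.random() for _ in range(n)) / n

# Approximate P(|theta_hat_n - theta0| > eps) by repeated simulation.
for n in [10, 100, 1000, 10000]:
    misses = sum(abs(sample_mean(n) - theta0) > eps for _ in range(200))
    print(n, misses / 200)
```

As n grows, the printed miss probability decays toward zero, which is exactly the convergence-in-probability property the definition asks for.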
Consistency as defined here is sometimes referred to as weak consistency. When we replace convergence in probability with almost sure convergence, then the estimator is said to be strongly consistent. Consistency is related to bias; see bias versus consistency.
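The distinction between bias and consistency can also be made concrete. A standard example (an illustrative sketch assuming uniform(0, 1) data, true variance 1/12) is the maximum-likelihood variance estimator that divides by n rather than n − 1: it is biased for every finite n, yet consistent, so its bias vanishes as n grows.

```python
import random

# Sketch: a biased but consistent estimator. Dividing the sum of squared
# deviations by n (not n - 1) underestimates the variance on average for
# finite n, but converges in probability to the true variance.
# Assumed population (hypothetical example): uniform(0, 1), true variance 1/12.
random.seed(1)
true_var = 1 / 12

def mle_var(n):
    """Variance estimate dividing by n: biased for finite n."""
    xs = [random.random() for _ in range(n)]
    m = sum(xs) / n
    return sum((x - m) ** 2 for x in xs) / n

# Average over replications to expose the bias E[estimate] = (n - 1)/n * true_var.
for n in [10, 100, 10000]:
    avg = sum(mle_var(n) for _ in range(300)) / 300
    print(n, avg)
```

For n = 10 the average estimate sits noticeably below 1/12, while for n = 10000 it is essentially on target: the estimator is biased at every finite n yet still consistent.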