Empirical distribution function in the context of Convergence of random variables

⭐ Core Definition: Empirical distribution function

In statistics, an empirical distribution function (also called an empirical cumulative distribution function, eCDF) is the distribution function associated with the empirical measure of a sample. It is a step function that jumps up by 1/n at each of the n data points. Its value at any specified point is the fraction of observations that are less than or equal to that point.
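
In symbols, for observations x_1, ..., x_n, the empirical distribution function is usually written as

\widehat{F}_n(t) = \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{ x_i \le t \},

where \mathbf{1}\{ x_i \le t \} is the indicator that the i-th observation does not exceed t.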

The empirical distribution function is an estimate of the cumulative distribution function that generated the points in the sample. It converges with probability 1 to that underlying distribution, according to the Glivenko–Cantelli theorem. A number of results exist to quantify the rate of convergence of the empirical distribution function to the underlying cumulative distribution function.
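
One well-known example of such a rate result is the Dvoretzky–Kiefer–Wolfowitz inequality (in the sharp form with constant 2 established by Massart), which bounds the probability of a large uniform deviation for every sample size n and every \varepsilon > 0:

\Pr\!\left( \sup_{t \in \mathbb{R}} \left| \widehat{F}_n(t) - F(t) \right| > \varepsilon \right) \le 2 e^{-2 n \varepsilon^{2}},

where F denotes the underlying cumulative distribution function.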


In this Dossier

Empirical distribution function in the context of Realization (probability)

In probability and statistics, a realization, observation, or observed value, of a random variable is the value that is actually observed (what actually happened). The random variable itself is the process dictating how the observation comes about. Statistical quantities computed from realizations without deploying a statistical model are often called "empirical", as in empirical distribution function or empirical probability.

Conventionally, to avoid confusion, upper case letters denote random variables; the corresponding lower case letters denote their realizations.
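
As a minimal sketch of this distinction (using NumPy and an arbitrarily chosen standard-normal model, purely for illustration): the random generator plays the role of the random variable, the drawn numbers are its realizations, and quantities computed from those numbers alone are "empirical".

import numpy as np

rng = np.random.default_rng(seed=0)   # the random mechanism (the "upper-case X" side)
x = rng.standard_normal(size=10)      # x_1, ..., x_10: observed realizations ("lower-case x")

empirical_mean = x.mean()             # computed from the realizations only, no model assumed
ecdf_at_zero = np.mean(x <= 0.0)      # fraction of observations <= 0: the eCDF evaluated at 0

print(empirical_mean, ecdf_at_zero)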


Empirical distribution function in the context of Glivenko–Cantelli theorem

In the theory of probability, the Glivenko–Cantelli theorem (sometimes referred to as the fundamental theorem of statistics), named after Valery Ivanovich Glivenko and Francesco Paolo Cantelli, describes the asymptotic behaviour of the empirical distribution function as the number of independent and identically distributed observations grows. Specifically, the empirical distribution function converges uniformly to the true distribution function almost surely.
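
As a purely numerical illustration of this statement (not a proof, and using an arbitrarily chosen Uniform(0, 1) model with NumPy), the following sketch approximates the uniform distance sup_t |F_n(t) - F(t)| on a fine grid for samples of increasing size; by the Glivenko–Cantelli theorem this distance should shrink toward zero.

import numpy as np

def ecdf(sample, t):
    # Fraction of observations less than or equal to each evaluation point in t.
    ordered = np.sort(sample)
    return np.searchsorted(ordered, t, side="right") / len(ordered)

rng = np.random.default_rng(seed=1)
grid = np.linspace(0.0, 1.0, 2001)        # evaluation points; true CDF of Uniform(0, 1) is F(t) = t

for n in (10, 100, 1_000, 10_000, 100_000):
    x = rng.uniform(size=n)               # i.i.d. Uniform(0, 1) sample
    sup_dist = np.max(np.abs(ecdf(x, grid) - grid))
    print(f"n = {n:>6}:  sup_t |F_n(t) - F(t)| is approximately {sup_dist:.4f}")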

The uniform convergence of more general empirical measures is the defining property of Glivenko–Cantelli classes of functions or sets. Glivenko–Cantelli classes arise in Vapnik–Chervonenkis theory, with applications to machine learning. Applications also appear in econometrics, where uniform convergence results of this kind are used in the analysis of M-estimators.
