Empirical likelihood
Empirical likelihood (EL) is an estimation method in statistics. Empirical likelihood estimates require fewer assumptions about the error distribution than comparable methods such as maximum likelihood, but the method does require that the data be independent and identically distributed (iid). It performs well even when the distribution is asymmetric or censored, and EL methods can also handle constraints and prior information on parameters. Art Owen pioneered work in this area with his 1988 paper.
Estimation procedure
EL estimates are calculated by maximizing the empirical likelihood function subject to constraints based on the estimating function and the trivial assumption that the probability weights of the likelihood function sum to 1.[1] This procedure is represented as:

$$\max_{\pi,\,\theta} \; \ln L_{EL} = \frac{1}{n} \sum_{i=1}^{n} \ln \pi_i$$
subject to the constraints

$$\sum_{i=1}^{n} \pi_i \, h(y_i; \theta) = 0, \qquad \sum_{i=1}^{n} \pi_i = 1,$$

where $\pi_i$ is the probability weight attached to observation $y_i$ and $h(y_i; \theta)$ is the estimating function.
The value of the parameter $\theta$ can be found by solving the Lagrangian

$$\mathcal{L}_{EL} = \frac{1}{n} \sum_{i=1}^{n} \ln \pi_i + \mu \left( 1 - \sum_{i=1}^{n} \pi_i \right) - \eta' \sum_{i=1}^{n} \pi_i \, h(y_i; \theta),$$

where $\mu$ and $\eta$ are the Lagrange multipliers attached to the two constraints.
There is a clear analogy between this maximization problem and the one solved for maximum entropy.
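The inner step of this maximization, holding $\theta$ fixed and optimizing over the weights, reduces to finding a single Lagrange multiplier when the estimating function is scalar. The Python sketch below illustrates this for the assumed choice $h(y_i; \theta) = y_i - \theta$ (estimating a mean): it computes the profile empirical log-likelihood $\frac{1}{n}\sum_i \ln \pi_i$ at a candidate $\theta$. It is a minimal illustration under those assumptions, not a reference implementation, and the function and variable names are invented for the example.

```python
import numpy as np
from scipy.optimize import brentq

def profile_log_el(y, theta):
    """Profile empirical log-likelihood (1/n) * sum(log pi_i) at a candidate theta,
    assuming the scalar estimating function h(y_i, theta) = y_i - theta.

    For fixed theta, maximizing over the weights subject to the two constraints
    gives the closed form pi_i = 1 / (n * (1 + lam * (y_i - theta))), where the
    multiplier lam solves sum_i (y_i - theta) / (1 + lam * (y_i - theta)) = 0.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    h = y - theta
    # The constraints can only be satisfied if theta lies strictly inside the data range.
    if theta <= y.min() or theta >= y.max():
        return -np.inf

    def score(lam):
        return np.sum(h / (1.0 + lam * h))

    # lam must keep every weight positive: 1 + lam * h_i > 0 for all i.
    eps = 1e-10
    lo = (-1.0 + eps) / h.max()
    hi = (-1.0 + eps) / h.min()
    lam = brentq(score, lo, hi)          # score is monotone in lam, so the root is unique

    pi = 1.0 / (n * (1.0 + lam * h))     # optimal weights; they sum to 1 by construction
    return np.mean(np.log(pi))

rng = np.random.default_rng(0)
y = rng.exponential(scale=2.0, size=50)

# The profile is maximized at the sample mean, where every weight equals 1/n,
# so the value there is -log(n); away from the mean it is strictly smaller.
print(profile_log_el(y, y.mean()))        # approximately -log(50) = -3.912
print(profile_log_el(y, y.mean() + 0.5))  # smaller than -log(50)
```

Maximizing this profile over $\theta$ gives the EL point estimate, which for the mean recovers the sample average; comparing profile values at other candidate values of $\theta$ is the basis of Owen's empirical likelihood ratio confidence intervals.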
See also
- Bootstrapping (statistics)
- Jackknife (statistics)
Notes
- Mittelhammer, Judge, and Miller (2000), 292.
- Bera and Bilias (2002), 77.
References
- Bera, Anil K.; Bilias, Yannis (2002), "The MM, ME, ML, EL, EF and GMM approaches to estimation: a synthesis", Journal of Econometrics, 107 (1–2): 51–86, CiteSeerX 10.1.1.25.34, doi:10.1016/s0304-4076(01)00113-0.
- Mittelhammer, Ron C.; Judge, George G.; Miller, Douglas J. (2000), Econometric Foundations, Cambridge University Press, ISBN 978-0521623940.
- Owen, Art B. (1988), "Empirical likelihood ratio confidence intervals for a single functional", Biometrika, 75 (2): 237–249, doi:10.1093/biomet/75.2.237.
- Owen, Art B. (2001), Empirical Likelihood, Chapman & Hall.