Additive model

In statistics, an additive model (AM) is a nonparametric regression method. It was suggested by Jerome H. Friedman and Werner Stuetzle (1981)[1] and is an essential part of the ACE algorithm. The AM uses a one-dimensional smoother to build a restricted class of nonparametric regression models. Because of this, it is less affected by the curse of dimensionality than, for example, a p-dimensional smoother. Furthermore, the AM is more flexible than a standard linear model, while being more interpretable than a general regression surface, at the cost of approximation error. Problems with the AM include model selection, overfitting, and multicollinearity.

Description

Given a data set $\{y_i,\, x_{i1}, \ldots, x_{ip}\}_{i=1}^{n}$ of n statistical units, where $x_{i1}, \ldots, x_{ip}$ represent predictors and $y_i$ is the outcome, the additive model takes the form

$\mathrm{E}[y_i \mid x_{i1}, \ldots, x_{ip}] = \beta_0 + \sum_{j=1}^{p} f_j(x_{ij})$

or

$Y = \beta_0 + \sum_{j=1}^{p} f_j(X_j) + \varepsilon,$

where $\mathrm{E}[\varepsilon] = 0$, $\operatorname{Var}(\varepsilon) = \sigma^2$, and $\mathrm{E}[f_j(X_j)] = 0$. The functions $f_j(x_{ij})$ are unknown smooth functions fit from the data. Fitting the AM (i.e. the functions $f_j$) can be done using the backfitting algorithm proposed by Andreas Buja, Trevor Hastie and Robert Tibshirani (1989).[2]
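As a rough illustration of how backfitting proceeds, the Python sketch below cycles over the predictors, smoothing the partial residuals against each predictor in turn and centering each fitted component so that its mean is zero, matching the identifiability constraint $\mathrm{E}[f_j(X_j)] = 0$ above. The moving-average smoother, the helper names (moving_average_smoother, backfit), the iteration count, and the toy data are illustrative assumptions rather than part of the published algorithm; in practice a more refined one-dimensional smoother (e.g. a smoothing spline or local regression) would typically be used.

```python
# Minimal backfitting sketch for an additive model (illustrative assumptions:
# moving-average smoother, fixed iteration count, toy data).
import numpy as np

def moving_average_smoother(x, r, window=15):
    """Smooth partial residuals r against predictor x with a crude moving average."""
    order = np.argsort(x)           # positions of x in increasing order
    r_sorted = r[order]
    smoothed = np.empty_like(r)
    half = window // 2
    for k in range(len(x)):
        lo, hi = max(0, k - half), min(len(x), k + half + 1)
        smoothed[order[k]] = r_sorted[lo:hi].mean()
    return smoothed

def backfit(X, y, n_iter=20):
    """Backfitting: cycle over predictors, smoothing partial residuals."""
    n, p = X.shape
    beta0 = y.mean()                # intercept estimate
    f = np.zeros((n, p))            # fitted component functions f_j evaluated at the data
    for _ in range(n_iter):
        for j in range(p):
            # partial residual: remove the intercept and all components except f_j
            partial = y - beta0 - f.sum(axis=1) + f[:, j]
            f[:, j] = moving_average_smoother(X[:, j], partial)
            f[:, j] -= f[:, j].mean()   # center so each component has mean zero
    return beta0, f

# Toy usage: y = sin(x1) + x2^2 + noise
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(300, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.2, size=300)
beta0, f = backfit(X, y)
print("estimated intercept:", round(beta0, 3))
```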

References

  1. Friedman, J.H. and Stuetzle, W. (1981). "Projection Pursuit Regression", Journal of the American Statistical Association 76:817–823. doi:10.1080/01621459.1981.10477729
  2. Buja, A., Hastie, T., and Tibshirani, R. (1989). "Linear Smoothers and Additive Models", The Annals of Statistics 17(2):453–555. JSTOR 2241560
