LogitBoost

In machine learning and computational learning theory, LogitBoost is a boosting algorithm formulated by Jerome Friedman, Trevor Hastie, and Robert Tibshirani. The original paper casts the AdaBoost algorithm into a statistical framework.[1] Specifically, if one considers AdaBoost as a generalized additive model and then applies the cost function of logistic regression, one can derive the LogitBoost algorithm.
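
For context (the framing is the paper's; the displayed formula below is a paraphrase): AdaBoost builds the same kind of additive model f(x) = \sum_t \alpha_t h_t(x), but fits it by minimizing the exponential loss

    \sum_i e^{-y_i f(x_i)},

whereas LogitBoost replaces this with the negative binomial log-likelihood, i.e. the logistic loss given in the next section, which grows only linearly rather than exponentially in the margin of badly misclassified points.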

Minimizing the LogitBoost cost function

LogitBoost can be seen as a convex optimization problem. Specifically, given that we seek an additive model of the form

    f(x) = \sum_t \alpha_t h_t(x),

the LogitBoost algorithm minimizes the logistic loss:

    \sum_i \log\left(1 + e^{-y_i f(x_i)}\right).
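
As a concrete illustration, here is a minimal sketch of the two-class LogitBoost procedure from the reference below, written in Python. Each round performs a Newton-style step on the logistic loss: a working response and weights are computed from the current probability estimates, and a weak learner is fitted to them by weighted least squares. The choice of scikit-learn regression stumps as weak learners, and names such as logitboost_fit and n_rounds, are illustrative assumptions, not part of the source.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def logitboost_fit(X, y, n_rounds=50):
    """Fit an additive model F(x) = sum_t f_t(x); labels y are in {0, 1}."""
    n = X.shape[0]
    F = np.zeros(n)            # current additive model evaluated on the data
    p = np.full(n, 0.5)        # current estimates of P(y = 1 | x)
    learners = []
    for _ in range(n_rounds):
        w = np.clip(p * (1.0 - p), 1e-10, None)   # Newton weights, clipped for stability
        z = (y - p) / w                           # working response
        stump = DecisionTreeRegressor(max_depth=1)
        stump.fit(X, z, sample_weight=w)          # weighted least-squares fit
        F += 0.5 * stump.predict(X)               # half-step update of the model
        p = 1.0 / (1.0 + np.exp(-2.0 * F))        # p(x) = e^F / (e^F + e^-F)
        learners.append(stump)
    return learners

def logitboost_predict(learners, X):
    """Classify by the sign of F(x), returning labels in {0, 1}."""
    F = 0.5 * sum(t.predict(X) for t in learners)
    return (F > 0).astype(int)

The half-step and the update p(x) = e^{F(x)} / (e^{F(x)} + e^{-F(x)}) follow the statement of the algorithm in the paper; the clipping of the weights here stands in for the numerical safeguards discussed there.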

References

  1. Friedman, Jerome; Hastie, Trevor; Tibshirani, Robert (2000). "Additive logistic regression: a statistical view of boosting". Annals of Statistics. 28 (2): 337–407. CiteSeerX 10.1.1.51.9525. doi:10.1214/aos/1016218223.