Mean signed deviation

In statistics, the mean signed difference, deviation, or error (MSD) is a sample statistic that summarises how well a set of estimates match the quantities that they are supposed to estimate. It is one of a number of statistics that can be used to assess an estimation procedure, and it is often used in conjunction with a sample version of the mean square error.

For example, suppose a linear regression model has been estimated over a sample of data, and is then used to extrapolate predictions of the dependent variable out of sample after the out-of-sample data points have become available. Then $\theta_i$ would be the i-th out-of-sample value of the dependent variable, and $\hat{\theta}_i$ would be its predicted value. The mean signed deviation is the average value of $\hat{\theta}_i - \theta_i$.
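As an illustrative sketch of this scenario (in Python, assuming NumPy; the data, variable names, and the use of numpy.polyfit as the fitted model are assumptions for illustration, not part of the article), a regression fitted in-sample can be scored on later out-of-sample observations:

    import numpy as np

    # In-sample data used to fit a simple linear regression.
    x_train = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y_train = np.array([1.2, 1.9, 3.2, 3.8, 5.1])
    slope, intercept = np.polyfit(x_train, y_train, deg=1)

    # Out-of-sample points that became available later.
    x_new = np.array([6.0, 7.0, 8.0])
    y_new = np.array([6.3, 6.8, 8.2])       # observed values, theta_i
    y_pred = slope * x_new + intercept      # predicted values, theta-hat_i

    # Mean signed deviation: the average of (prediction - observation).
    msd = np.mean(y_pred - y_new)

A positive result indicates that the predictions overshoot the observations on average, while a negative result indicates systematic undershooting.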

Definition

The mean signed difference is derived from a set of n pairs, $(\hat{\theta}_i, \theta_i)$, where $\hat{\theta}_i$ is an estimate of the parameter $\theta$ in a case where it is known that $\theta = \theta_i$. In many applications, all the quantities $\theta_i$ will share a common value. When applied to forecasting in a time series analysis context, a forecasting procedure might be evaluated using the mean signed difference, with $\hat{\theta}_i$ being the predicted value of a series at a given lead time and $\theta_i$ being the value of the series eventually observed for that time-point. The mean signed difference is defined to be

$$\operatorname{MSD}(\hat{\theta}) = \frac{1}{n} \sum_{i=1}^{n} \left( \hat{\theta}_i - \theta_i \right).$$
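A minimal implementation of this definition (a sketch in Python; the function name mean_signed_deviation is a hypothetical choice for illustration) follows directly from the formula:

    def mean_signed_deviation(estimates, true_values):
        # Average of (estimate - true value) over all n pairs.
        n = len(estimates)
        return sum(e - t for e, t in zip(estimates, true_values)) / n

    # Example: estimates [2.1, 3.9, 5.2] of true values [2.0, 4.0, 5.0]
    # give MSD = (0.1 - 0.1 + 0.2) / 3, approximately 0.0667.
    print(mean_signed_deviation([2.1, 3.9, 5.2], [2.0, 4.0, 5.0]))

Because positive and negative deviations can cancel, the statistic measures systematic bias in the estimates rather than their overall accuracy, unlike the mean square error.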
