Mean square


In mathematics and its applications, the mean square is normally defined as the arithmetic mean of the squares of a set of numbers or of a random variable.[1]
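
For a concrete illustration, here is a minimal Python sketch of this definition (the data values are made up):

```python
# Minimal sketch: the mean square of a set of numbers is the
# arithmetic mean of their squares. The data values are illustrative.
xs = [1.0, -2.0, 3.0, 4.0]

mean_square = sum(x * x for x in xs) / len(xs)
print(mean_square)  # (1 + 4 + 9 + 16) / 4 = 7.5
```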

It may also be defined as the arithmetic mean of the squares of the deviations of a set of numbers from a reference value (e.g., the mean or an assumed mean of the data),[2] in which case it may be known as the mean square deviation. When the reference value is the assumed true value, the result is known as the mean squared error.
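
A minimal Python sketch of the mean square deviation about a reference value c; the data values are illustrative, and c is taken here to be the ordinary mean (taking c to be an assumed true value instead would give the mean squared error):

```python
# Mean square deviation of a set of numbers about a reference value c.
xs = [1.0, -2.0, 3.0, 4.0]
c = sum(xs) / len(xs)  # here the reference is the ordinary mean

msd = sum((x - c) ** 2 for x in xs) / len(xs)
print(msd)  # 5.25: the mean square deviation about the mean
```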

A typical estimate of the sample variance from a set of n sample values x_i uses a divisor of the number of values minus one, n − 1, rather than n as in a simple quadratic mean, and this is still called the "mean square" (e.g. in analysis of variance):

$$ s^2 = \frac{1}{n-1} \sum_{i=1}^{n} \left( x_i - \bar{x} \right)^2 $$

where $\bar{x}$ is the sample mean.
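
A minimal Python sketch of this n − 1 divisor, checked against the standard library's statistics module (the data values are illustrative):

```python
import statistics

# Sample variance as a "mean square": sum of squared deviations
# about the sample mean, divided by n - 1 rather than n.
xs = [1.0, -2.0, 3.0, 4.0]
n = len(xs)
xbar = sum(xs) / n

ms = sum((x - xbar) ** 2 for x in xs) / (n - 1)

print(ms)                        # 7.0
print(statistics.variance(xs))   # stdlib sample variance also divides by n - 1
print(statistics.pvariance(xs))  # population variance divides by n: 5.25
```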

The second moment of a random variable, $E[X^2]$, is also called the mean square. The square root of a mean square is known as the root mean square (RMS or rms), and can be used as an estimate of the standard deviation of a random variable when the random variable is zero-mean.
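
A minimal Python sketch relating the RMS to the standard deviation: the RMS of deviations about the mean equals the population standard deviation, which the stdlib statistics.pstdev confirms (the data values are illustrative):

```python
import math
import statistics

# Root mean square: the square root of a mean square.
xs = [1.0, -2.0, 3.0, 4.0]
n = len(xs)
xbar = sum(xs) / n

rms = math.sqrt(sum(x * x for x in xs) / n)                # RMS of the values
rms_dev = math.sqrt(sum((x - xbar) ** 2 for x in xs) / n)  # RMS of deviations

print(rms)                    # sqrt(7.5) ~ 2.7386
print(rms_dev)                # sqrt(5.25) ~ 2.2913
print(statistics.pstdev(xs))  # population standard deviation, matches rms_dev
```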

References

  1. "Noise and Noise Rejection" (PDF). engineering.purdue.edu/ME365/Textbook/chapter11. Retrieved 6 January 2020.
  2. "OECD Glossary of Statistical Terms". oecd.org. Retrieved 6 January 2020.