
Calculating the Variance with the VAR() Function

When computing the variability of a set of values, one straightforward approach would be to calculate how much each value deviates from the mean. You could then add those differences and divide by the number of values in the sample to get what might be called the “average difference.” The problem, however, is that, by definition of the arithmetic mean, adding the differences (some of which are positive and some of which are negative) gives the result 0. To solve this problem, you need to add the absolute values of the deviations and then divide by the sample size. This is what statisticians call the average deviation.
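As a sketch of that idea (using notation not in the original: x_1 through x_n for the sample values and \bar{x} for their mean), the average deviation can be written as

\text{average deviation} = \frac{1}{n} \sum_{i=1}^{n} \left| x_i - \bar{x} \right|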

Unfortunately, this simple state of affairs is still problematic because (for highly technical reasons) mathematicians tend to shudder at equations that require absolute values. To get around this, they instead use the square of each deviation from the mean, which always results in a positive number. They sum these squares and divide by the number of values; the result is called the variance.
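Written out with the same assumed notation as above, the quantity described in this paragraph is

\text{variance} = \frac{1}{n} \sum_{i=1}^{n} \left( x_i - \bar{x} \right)^2

In worksheet terms, a minimal usage sketch, assuming the sample values sit in a hypothetical range A2:A11, would be =VAR(A2:A11); the range is illustrative only and does not come from the original text.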

