Variance and Standard Deviation
The variance of a set of data is a measure of how spread out the data is around its mean. The standard deviation is the square root of the variance; unlike the variance, it is expressed in the same units as the data itself.
Variance Definition in Statistics
The variance is a measure of the dispersion of a set of data points around their mean. It is calculated as the average of the squared deviations from the mean:

σ² = (1/N) * Σ(xᵢ - μ)²

where μ is the mean of the data set and N is the number of data points. Equivalently, the variance is the square of the standard deviation. The variance is usually represented by the symbol σ².
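As a concrete illustration, here is a minimal Python sketch that computes the population variance directly from this definition (the data values and the population_variance helper are made up for this example):

```python
def population_variance(data):
    # Average of the squared deviations from the mean (sigma^2)
    mean = sum(data) / len(data)
    return sum((x - mean) ** 2 for x in data) / len(data)

data = [2, 4, 4, 4, 5, 5, 7, 9]  # example data set with mean 5
var = population_variance(data)
std = var ** 0.5  # the standard deviation is the square root of the variance

print(var)  # 4.0
print(std)  # 2.0
```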
Variance Properties
The variance of a random variable measures the extent to which its individual values deviate from its expected value (mean). It is equal to the square of the standard deviation of the random variable. The variance is always non-negative (it is zero only for a constant), and its units are the square of the units of the underlying variable.
The variance is used to measure the variability of a random variable’s values. It can be used to compare the variability of two different random variables, or to compare the variability of a random variable against some threshold. The variance is also used to calculate the standard deviation, which is its square root.
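For instance, here is a short sketch (with made-up samples) that compares the variability of two data sets using the pvariance function from Python's standard-library statistics module:

```python
import statistics

# Two hypothetical samples with the same mean (5.0) but different spread
sample_a = [4.8, 5.0, 5.2, 4.9, 5.1]
sample_b = [2.0, 8.0, 3.5, 6.5, 5.0]

# pvariance computes the population variance (divides by N)
var_a = statistics.pvariance(sample_a)
var_b = statistics.pvariance(sample_b)

print(f"variance of A = {var_a:.2f}")  # 0.02 -> less variable
print(f"variance of B = {var_b:.2f}")  # 4.50 -> more variable
```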
Variance Example
In the example below, assume that there are only two possible outcomes for a particular event: either the event happens or it does not. Let’s also assume that the probability of the event happening is 0.5.
The variance of such an event, modeled as a Bernoulli random variable X that equals 1 if the event happens (with probability p) and 0 otherwise, is calculated using the following formula:

var(X) = p * (1 - p)

This follows from the definition: E[X] = p and E[X²] = p, so var(X) = E[X²] - (E[X])² = p - p² = p * (1 - p). Plugging in p = 0.5:

var(X) = 0.5 * (1 - 0.5) = 0.25

In this example, the variance of the event is 0.25.
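As a quick sanity check, here is a small simulation sketch (the variable names are made up for this example) that estimates this variance empirically:

```python
import random

p = 0.5       # probability that the event happens
n = 100_000   # number of simulated trials

# Simulate n Bernoulli trials: 1 if the event happens, 0 otherwise
outcomes = [1 if random.random() < p else 0 for _ in range(n)]

mean = sum(outcomes) / n
variance = sum((x - mean) ** 2 for x in outcomes) / n

print(f"estimated variance = {variance:.4f}")  # close to p * (1 - p) = 0.25
```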