
If a set of scores has a mean of 100.00 and
a standard deviation of 5.00, what is the
variance of the standard scores?


Answer:

The variance of the standard scores is 1.00

Explanation:

A standard score (z-score) rescales each raw score by the mean and standard deviation of the set: [tex]z=\frac{x-\mu}{\sigma}[/tex]. By construction, a set of standard scores always has a mean of 0 and a standard deviation of 1, no matter what the original mean (here 100.00) and standard deviation (here 5.00) were. Recall also that the variance is the square of the standard deviation, so squaring the standard deviation of the standard scores gives:

[tex]Variance= \sigma_z^2= 1^2= 1[/tex]

(Note that squaring the raw-score standard deviation, [tex]5^2=25[/tex], gives the variance of the original scores, not of the standard scores the question asks about.)
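A quick numerical check can be sketched in Python. The data set below is made up for illustration (any scores work); it happens to have mean 100 but a different standard deviation, which does not matter, since standardizing always yields variance 1:

```python
import statistics

# Illustrative raw scores (hypothetical data, chosen only for the demo)
scores = [90, 95, 100, 105, 110]

mu = statistics.mean(scores)        # mean of the raw scores
sigma = statistics.pstdev(scores)   # population standard deviation

# Convert each raw score to a standard score (z-score)
z = [(x - mu) / sigma for x in scores]

# The variance of the standard scores is 1 (up to floating-point rounding)
print(statistics.pvariance(z))
```

The same result holds for any set of scores, which is why the answer does not depend on the 100.00 or the 5.00 given in the question.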
