Z = (score - mean) / SD
You would then have a Z score for each raw score, expressing its value in terms of standard deviations from the mean.
Standardization is a way to measure how far a particular data point lies from the mean, in units of standard deviations. Because the result is unit-free, it lets us compare values drawn from different distributions.
In this case, we have an IQ distribution with a mean of 100 and a standard deviation of 16. Let's say we have an IQ score of 120. To standardize this score, we subtract the mean (100) and divide by the standard deviation (16).
Standardized score = (IQ score - Mean) / Standard deviation
Standardized score = (120 - 100) / 16
= 20 / 16
= 1.25
So, subtracting the mean (100) from the score and dividing by the standard deviation (16) gives a standardized score of 1.25 for an IQ score of 120. This indicates that the score of 120 lies 1.25 standard deviations above the mean.
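The worked example above can be checked with a few lines of Python (the variable names are illustrative):

```python
# Standardize a single IQ score: subtract the mean, divide by the SD.
iq_score = 120
mean = 100
sd = 16

z = (iq_score - mean) / sd
print(z)  # 1.25 -> the score sits 1.25 standard deviations above the mean
```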
To calculate the standardized score for each IQ test score, we can use the formula:
Standardized Score = (Raw Score - Mean) / Standard Deviation
In this case, the mean is 100 and the standard deviation is 16. So, the formula becomes:
Standardized Score = (Raw Score - 100) / 16
By plugging each raw score into this formula, you can calculate its standardized score.
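A minimal sketch of applying the formula to a whole set of scores (the function name `z_score` and the sample raw scores are illustrative, not from the original):

```python
def z_score(raw, mean=100.0, sd=16.0):
    """Return how many standard deviations `raw` lies from the mean."""
    return (raw - mean) / sd

# Hypothetical raw IQ scores, standardized one by one.
raw_scores = [84, 100, 120, 132]
standardized = [z_score(s) for s in raw_scores]
print(standardized)  # [-1.0, 0.0, 1.25, 2.0]
```

A score of 100 maps to 0 (it is the mean), while 84 maps to -1.0 (one standard deviation below the mean).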