So, I'm a total statistics neophyte, and I need to calculate the number of standard deviations (the z-score) corresponding to a confidence level, in JavaScript.
I have a table which provides some data:
| Level   | 80%   | 90%   | 95%   | 98%   | 99%   |
|---------|-------|-------|-------|-------|-------|
| z-score | 1.282 | 1.645 | 1.960 | 2.326 | 2.576 |
In our app, the user needs to be able to enter whatever confidence level they want, and we need to determine the corresponding number of standard deviations from that input. I'm trying to find the math or a function to do this, but since I'm not well versed in what I'm looking for, I'm coming up short.
Best Answer
If you're asking for a function that takes a coverage level like the top row of your table and returns a z-value like the bottom row, you can do it with a function that computes the inverse normal CDF, also known as the normal quantile function (some packages call it the percent point function).
Such a function doesn't do exactly what you're asking on its own, but getting there from it is a simple calculation.
However, no closed-form expression for the normal quantile function exists; implementations usually rely on one of various accurate approximations.
With those search terms in hand, a number of example implementations are readily located via a search engine. I can't speak to the correctness of any particular one, but they're easy enough to check against Excel or R (or against more extensive tables, also easily found). There are also a number of more extensive stats libraries, but again I can't vouch for their accuracy.
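For concreteness, here's a minimal sketch of one widely circulated approximation, Peter Acklam's rational approximation to the normal quantile function (absolute error on the order of 1e-9). The function name `normalQuantile` is my own choice; as with any implementation you find, spot-check its output against R or published tables before trusting it:

```javascript
// Inverse of the standard normal CDF (the quantile function), via
// Peter Acklam's rational approximation (absolute error ~1.15e-9).
function normalQuantile(p) {
  if (p <= 0 || p >= 1) {
    throw new RangeError("p must be strictly between 0 and 1");
  }

  // Coefficients of the rational approximations for each region.
  const a = [-3.969683028665376e+01,  2.209460984245205e+02, -2.759285104469687e+02,
              1.383577518672690e+02, -3.066479806614716e+01,  2.506628277459239e+00];
  const b = [-5.447609879822406e+01,  1.615858368580409e+02, -1.556989798598866e+02,
              6.680131188771972e+01, -1.328068155288572e+01];
  const c = [-7.784894002430293e-03, -3.223964580411365e-01, -2.400758277161838e+00,
             -2.549732539343734e+00,  4.374664141464968e+00,  2.938163982698783e+00];
  const d = [ 7.784695709041462e-03,  3.224671290700398e-01,  2.445134137142996e+00,
              3.754408661907416e+00];

  const pLow = 0.02425;  // region boundaries chosen by Acklam
  const pHigh = 1 - pLow;

  if (p < pLow) {
    // Rational approximation for the lower tail.
    const q = Math.sqrt(-2 * Math.log(p));
    return (((((c[0] * q + c[1]) * q + c[2]) * q + c[3]) * q + c[4]) * q + c[5]) /
           ((((d[0] * q + d[1]) * q + d[2]) * q + d[3]) * q + 1);
  }
  if (p <= pHigh) {
    // Rational approximation for the central region.
    const q = p - 0.5;
    const r = q * q;
    return (((((a[0] * r + a[1]) * r + a[2]) * r + a[3]) * r + a[4]) * r + a[5]) * q /
           (((((b[0] * r + b[1]) * r + b[2]) * r + b[3]) * r + b[4]) * r + 1);
  }
  // Rational approximation for the upper tail.
  const q = Math.sqrt(-2 * Math.log(1 - p));
  return -(((((c[0] * q + c[1]) * q + c[2]) * q + c[3]) * q + c[4]) * q + c[5]) /
          ((((d[0] * q + d[1]) * q + d[2]) * q + d[3]) * q + 1);
}
```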
Anyway, once you have an inverse normal CDF (quantile function), call it Q(x), then evaluating
Q(1 - (1 - level)/2)
should give you the functionality you mention above. (The reasoning: a two-sided interval at a given confidence level leaves probability 1 - level split evenly between the two tails, so the upper cutoff sits at the 1 - (1 - level)/2 quantile.)
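In JavaScript, with the `normalQuantile` sketch above playing the role of Q, that looks something like this (the wrapper name `zFromConfidence` is mine):

```javascript
// Two-sided z-score for a given confidence level,
// using the normalQuantile sketch above as Q.
function zFromConfidence(level) {
  return normalQuantile(1 - (1 - level) / 2);
}

zFromConfidence(0.80); // ~1.282
zFromConfidence(0.90); // ~1.645
zFromConfidence(0.95); // ~1.960
zFromConfidence(0.98); // ~2.326
zFromConfidence(0.99); // ~2.576
```

which reproduces the z-scores in your table.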