Confidence Interval: Standard Deviation or Standard Error?
AnalystForum, CFA Level I Forum: Confidence interval: standard error vs standard deviation

heathcliff101 (CFA Level I Candidate), Jan 9th, 2013, 10:55am:
Hi guys! When constructing a confidence interval, what determines whether we use the standard error or the standard deviation in the formula? I know what both terms are; I just struggle to choose the correct one when answering questions. Any help would be appreciated! Heathcliff

VWJETTY (CFA Level III Candidate), Jan 9th, 2013, 11:38am:
It depends on the information they give you.

cfageist (CFA Level I Candidate), Jan 9th, 2013, 11:56am:
You always use the standard error to calculate confidence intervals. The standard error equals $s / \sqrt{n}$, where $s$ is the sample estimate of the population standard deviation and $n$ is the sample size.

Maggie88 (Passed CFA Level I), Jan 10th, 2013, 3:59pm:
Standard error it is.

sj.1802 (Passed CFA Level II), Jan 11th, 2013, 4:51am:
Always use the standard error.

heathcliff101, Jan 11th, 2013, 6:01am:
Thanks for your responses, guys!

Aether (CFA Charterholder), Jan 14th, 2013, 3:12pm:
Not to confuse anyone, but be careful on the exam: if they supply both …
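As a minimal sketch of the point the thread settles on, the snippet below builds a confidence interval for a sample mean from the standard error, not the raw standard deviation. The sample data are illustrative, not from the thread, and the z critical value 1.96 is used for simplicity even though a t critical value would be more appropriate at this small sample size.

    import math
    from statistics import mean, stdev

    # Hypothetical sample (illustrative data only)
    sample = [0.02, -0.01, 0.03, 0.015, -0.005, 0.01, 0.025, 0.00, 0.02, -0.02]

    n = len(sample)
    x_bar = mean(sample)      # sample mean
    s = stdev(sample)         # sample standard deviation (estimate of population sigma)
    se = s / math.sqrt(n)     # standard error of the mean: s / sqrt(n)

    # 95% confidence interval around the mean, using the standard error.
    # (With n = 10, a t critical value would be more accurate than z = 1.96.)
    z = 1.96
    ci_low, ci_high = x_bar - z * se, x_bar + z * se
    print(f"mean = {x_bar:.4f}, s = {s:.4f}, SE = {se:.4f}")
    print(f"95% CI: ({ci_low:.4f}, {ci_high:.4f})")

Note that using s itself in place of se here would give an interval for where a single observation might fall, not for where the population mean lies, which is the distinction the exam questions test.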
Cross Validated: Difference between standard error and standard deviation

Question (louis xie, Jul 15 '12; edited by gung, Aug 9 '15; tags: mean, standard-deviation, standard-error, basic-concepts):
I'm struggling to understand the difference between the standard error and the standard deviation. How are they different, and why do you need to measure the standard error?

Comment (Francesco, Jul 15 '12): A quick comment, not an answer, since two useful ones are already present: the standard deviation is a property of the (distribution of the) random variable(s). The standard error is instead related to a measurement on a specific sample. The two can get confused when blurring the distinction between the universe and your sample.

Comment (Macro, Jul 16 '12): Possibly of interest: stats.stackexchange.com/questions/15505/…

Accepted answer:
To complete the answer to the question, ocram nicely addressed the standard error but did not contrast it with the standard deviation and did not mention the dependence on sample size. As a special case of an estimator, consider the sample mean. The standard error of the mean is $\sigma / \sqrt{n}$, where $\sigma$ is the population standard deviation. So in this example we see explicitly how the standard error decreases with increasing sample size. The standard deviation is most often used to refer to the individual observations: the standard deviation describes the variability of the individual observations, while the standard error shows the variability of the estimator. Good estimators are consistent, which means that they converge to the true parameter value as the sample size grows.
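A quick simulation sketch (not from the original answer) makes the accepted answer's contrast concrete: as the sample size grows, the sample standard deviation stays near the true $\sigma$, while the standard error of the mean shrinks like $\sigma / \sqrt{n}$. The distribution and parameter values are assumptions chosen for illustration.

    import math
    import random
    from statistics import stdev

    random.seed(0)
    sigma = 2.0  # assumed true population standard deviation

    # Draw samples of increasing size from N(0, sigma^2) and compare
    # the sample standard deviation (roughly constant) with the
    # standard error of the mean (shrinks like sigma / sqrt(n)).
    for n in (10, 100, 1000, 10000):
        sample = [random.gauss(0.0, sigma) for _ in range(n)]
        s = stdev(sample)        # variability of individual observations
        se = s / math.sqrt(n)    # variability of the sample mean
        print(f"n={n:6d}  sample SD={s:.3f}  SE of mean={se:.4f}  "
              f"sigma/sqrt(n)={sigma / math.sqrt(n):.4f}")

Running this shows the sample SD column hovering around 2.0 at every n, while the SE column falls by roughly a factor of sqrt(10) per row, which is exactly the sample-size dependence the answer describes.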