Normalized Error Function
Erf is the "error function"
encountered in integrating the normal distribution (which is a normalized form of the Gaussian function). It is an entire function defined by

  erf(z) = (2/√π) ∫₀^z e^(−t²) dt.

Note that some authors (e.g., Whittaker and Watson 1990, p. 341) define erf without the leading factor of 2/√π. Erf is implemented in the Wolfram Language as Erf[z]; a two-argument form giving erf(z₁) − erf(z₀) is also implemented, as Erf[z0, z1].

Erf satisfies the identities

  erf(z) = 1 − erfc(z)
  erf(z) = (2z/√π) ₁F₁(1/2; 3/2; −z²),

where erfc is the complementary error function and ₁F₁ is a confluent hypergeometric function of the first kind. For x ≥ 0,

  erf(x) = (1/√π) γ(1/2, x²),

where γ is the lower incomplete gamma function.

Erf can also be defined by the Maclaurin series

  erf(z) = (2/√π) Σ_{n=0}^∞ (−1)ⁿ z^(2n+1) / (n! (2n+1))
         = (2/√π) (z − z³/3 + z⁵/10 − z⁷/42 + z⁹/216 − ...)

(OEIS A007680). For large x, repeated integration by parts gives the asymptotic series

  erf(x) ~ 1 − (e^(−x²)/(x√π)) Σ_{n=0}^∞ (−1)ⁿ (2n − 1)!! / (2x²)ⁿ

(OEIS A001147 and A000079; Acton 1990).

Erf has the values

  erf(0) = 0,  erf(∞) = 1.

It is an odd function,

  erf(−z) = −erf(z),

and satisfies

  erf(z) + erfc(z) = 1.

Its first derivative is

  d/dz erf(z) = (2/√π) e^(−z²),

and more generally the nth derivative is (−1)^(n−1) (2/√π) H_{n−1}(z) e^(−z²), where H_n is a Hermite polynomial. Its integral is

  ∫ erf(z) dz = z erf(z) + e^(−z²)/√π + C.

Erf can also be extended to the complex plane. Certain simple integrals involving erf cannot be done in closed form by the Wolfram Language (M. R. D'Orsogna, pers. comm., May 9, 2004 and Dec. 15, 2005). Erf also has a continued fraction expansion (Wall 1948, p. 357).
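As an illustration (not part of the MathWorld entry), the Maclaurin series above can be checked numerically against Python's built-in math.erf; the helper name erf_maclaurin and the term count are choices made here for the sketch:

```python
import math

def erf_maclaurin(z: float, terms: int = 30) -> float:
    """Approximate erf(z) by its Maclaurin series:
    erf(z) = (2/sqrt(pi)) * sum_{n>=0} (-1)^n z^(2n+1) / (n! (2n+1)).
    Converges quickly for moderate |z|."""
    total = 0.0
    for n in range(terms):
        total += (-1) ** n * z ** (2 * n + 1) / (math.factorial(n) * (2 * n + 1))
    return 2.0 / math.sqrt(math.pi) * total

# Compare the truncated series with the library implementation.
for z in (0.0, 0.5, 1.0, 2.0):
    print(z, erf_maclaurin(z), math.erf(z))
```

For moderate arguments the 30-term partial sum agrees with math.erf to near machine precision; for large arguments the asymptotic series above is the better tool.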
Normalization (statistics)

In statistics and applications of statistics,
normalization can have a range of meanings.[1] In the simplest cases, normalization of ratings means adjusting values measured on different scales to a notionally common scale, often prior to
averaging. In more complicated cases, normalization may refer to more sophisticated adjustments whose intention is to bring the entire probability distributions of adjusted values into alignment. In the case of normalization of scores in educational assessment, there may be an intention to align distributions to a normal distribution. A different approach to normalization of probability distributions is quantile normalization, where the quantiles of the different measures are brought into alignment.

In another usage in statistics, normalization refers to the creation of shifted and scaled versions of statistics, where the intention is that these normalized values allow the comparison of corresponding normalized values for different datasets in a way that eliminates the effects of certain gross influences, as in an anomaly time series. Some types of normalization involve only a rescaling, to arrive at values relative to some size variable. In terms of levels of measurement, such ratios only make sense for ratio measurements (where ratios of measurements are meaningful), not interval measurements (where only distances are meaningful, but not ratios).

In theoretical statistics, parametric normalization can often lead to pivotal quantities (functions whose sampling distribution does not depend on the parameters) and to ancillary statistics (pivotal quantities that can be computed from observations, without knowing parameters).

There are various normalizations in statistics: nondimensional ratios of errors, residuals, means, and standard deviations, which are hence scale invariant.
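A common example of such a scale-invariant normalization is the standard score. A minimal sketch in Python (the helper name standard_scores is mine, not from the article):

```python
import statistics

def standard_scores(data):
    """Standard score (z-score): (x - mean) / standard deviation.
    The result is nondimensional, hence comparable across scales."""
    mu = statistics.mean(data)
    sd = statistics.stdev(data)          # sample standard deviation
    return [(x - mu) / sd for x in data]

z = standard_scores([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
# The normalized values have mean 0 and sample standard deviation 1.
```

Because the z-score subtracts the mean and divides by the standard deviation, it removes both location and scale, which is what makes normalized values from different datasets comparable.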
Q-function

Q(x) is the probability that a normal (Gaussian) random variable will take a value larger than x standard deviations above the mean. If the underlying random variable is y, then the proper argument to the tail probability is derived as

  x = (y − μ)/σ,

which expresses the number of standard deviations away from the mean. Other definitions of the Q-function, all of which are simple transformations of the normal cumulative distribution function, are also used occasionally.[3] Because of its relation to the cumulative distribution function of the normal distribution, the Q-function can also be expressed in terms of the error function, which is an important function in applied mathematics and physics.

Formally, the Q-function is defined as

  Q(x) = (1/√(2π)) ∫ₓ^∞ exp(−u²/2) du.

Thus

  Q(x) = 1 − Q(−x) = 1 − Φ(x),

where Φ(x) is the cumulative distribution function of the standard normal distribution. The Q-function can be expressed in terms of the error function, or the complementary error function, as[2]

  Q(x) = (1/2) (2/√π) ∫_{x/√2}^∞ e^(−t²) dt
       = 1/2 − (1/2) erf(x/√2)
       = (1/2) erfc(x/√2).
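The last identity gives a direct numerical route via the standard library's complementary error function; a minimal sketch (the function names Q and Phi are chosen here for illustration):

```python
import math

def Q(x: float) -> float:
    """Standard normal tail probability: Q(x) = (1/2) * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def Phi(x: float) -> float:
    """Standard normal CDF: Phi(x) = 1 - Q(x)."""
    return 1.0 - Q(x)

# By symmetry of the Gaussian, Q(0) = 1/2 and Q(x) + Q(-x) = 1.
```

Using erfc rather than 1 − erf avoids catastrophic cancellation for large x, where the tail probability is tiny.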
Normal distribution

For other uses, see Bell curve (disambiguation).

[Plots of the probability density function and cumulative distribution function; the red curve is the standard normal distribution.]

  Notation: N(μ, σ²)
  Parameters: μ ∈ ℝ (mean, location); σ² > 0 (variance, squared scale)
  Support: x ∈ ℝ
  PDF: (1/√(2πσ²)) e^(−(x − μ)²/(2σ²))
  CDF: (1/2) [1 + erf((x − μ)/(σ√2))]
  Quantile: μ + σ√2 erf⁻¹(2F − 1)
  Mean, median, mode: μ
  Variance: σ²
  Skewness: 0
  Excess kurtosis: 0
  Entropy: (1/2) ln(2πeσ²)
  MGF: exp(μt + (1/2)σ²t²)
  CF: exp(iμt − (1/2)σ²t²)
  Fisher information: diag(1/σ², 1/(2σ⁴))

In probability theory, the normal (or Gaussian) distribution is a very common continuous probability distribution. Normal distributions are important in statistics and are often used in the natural and social sciences to represent real-valued random variables whose distributions are not known.[1][2]

The normal distribution is useful because of the central limit theorem. In its most general form, under some conditions (which include finite variance), it states that averages of independent random variables (not necessarily identically distributed) converge in distribution to the normal; that is, they become normally distributed when the number of random variables is sufficiently large.
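The PDF and CDF formulas in the table above can be sketched directly with the standard library (a minimal illustration, not an excerpt from the article):

```python
import math

def normal_pdf(x: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    """Density (1/sqrt(2*pi*sigma^2)) * exp(-(x - mu)^2 / (2*sigma^2))."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

def normal_cdf(x: float, mu: float = 0.0, sigma: float = 1.0) -> float:
    """CDF (1/2) * (1 + erf((x - mu) / (sigma * sqrt(2))))."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# By symmetry, the CDF evaluated at the mean is exactly 1/2.
```

Note how the CDF formula is just the two-parameter reparametrization of erf from the table: standardize x to (x − μ)/σ, then rescale the error function's argument by √2.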
Physical quantities that are expected to be the sum of many independent processes (such as measurement errors) often have distributions that are nearly normal.[3] Moreover, many results and methods (such as propagation of uncertainty and least squares parameter fitting) can be derived analytically in explicit form when the relevant variables are normally distributed. The normal distribution is sometimes informally called the bell curve. However, many other distributions are bell-shaped (such as the Cauchy and Student's t distributions).