11.2: B - Mathematical Phrases, Symbols, and Formulas
English Phrases Written Mathematically
| When the English says: | Interpret this as: |
|---|---|
| \(X\) is at least 4. | \(X \geq 4\) |
| \(X\) is at most 4. The maximum of \(X\) is 4. \(X\) is no more than 4. \(X\) is less than or equal to 4. \(X\) does not exceed 4. | \(X \leq 4\) |
| \(X\) is greater than 4. \(X\) is more than 4. \(X\) exceeds 4. | \(X > 4\) |
| \(X\) is less than 4. | \(X < 4\) |
| \(X\) is 4. \(X\) is equal to 4. \(X\) is the same as 4. | \(X = 4\) |
| \(X\) is not 4. \(X\) is not equal to 4. \(X\) is not the same as 4. \(X\) is different from 4. | \(X \neq 4\) |
Symbols and Their Meanings
| Chapter (1st used) | Symbol | Spoken | Meaning |
|---|---|---|---|
| Sampling and Data | \(\sqrt{ } \) | The square root of | same |
| Descriptive Statistics | \(Q_1\) | quartile one | the first quartile |
| Descriptive Statistics | \(Q_2\) | quartile two | the second quartile |
| Descriptive Statistics | \(Q_3\) | quartile three | the third quartile |
| Descriptive Statistics | \(IQR\) | interquartile range | \(IQR = Q_3 - Q_1\) |
| Descriptive Statistics | \(\overline x\) | \(x\)-bar | sample mean |
| Descriptive Statistics | \(\mu\) | mu | population mean |
| Descriptive Statistics | \(s\) | \(s\) | sample standard deviation |
| Descriptive Statistics | \(s^2\) | \(s\) squared | sample variance |
| Descriptive Statistics | \(\sigma\) | sigma | population standard deviation |
| Descriptive Statistics | \(\sigma^2\) | sigma squared | population variance |
| Descriptive Statistics | \(\Sigma\) | capital sigma | sum |
| Probability Topics | \(\{ \}\) | braces | set notation |
| Probability Topics | \(S\) | \(S\) | sample space |
| Probability Topics | \(A\) | event \(A\) | event \(A\) |
| Probability Topics | \(P(A)\) | probability of \(A\) | probability of \(A\) occurring |
| Probability Topics | \(P(A|B)\) | probability of \(A\) given \(B\) | probability of \(A\) occurring given \(B\) has occurred |
| Probability Topics | \(P(A\cup B)\) | probability of \(A\) or \(B\) | probability of \(A\) or \(B\) or both occurring |
| Probability Topics | \(P(A\cap B)\) | probability of \(A\) and \(B\) | probability of both \(A\) and \(B\) occurring (same time) |
| Probability Topics | \(A^{\prime}\) | \(A\)-prime; complement of \(A\) | complement of \(A\); not \(A\) |
| Probability Topics | \(P(A^{\prime})\) | probability of the complement of \(A\) | same |
| Probability Topics | \(G_1\) | green on first pick | same |
| Probability Topics | \(P(G_1)\) | probability of green on first pick | same |
| The Normal Distribution | \(N\) | normal distribution | same |
| The Normal Distribution | \(z\) | \(z\)-score | same |
| The Normal Distribution | \(Z\) | standard normal distribution | same |
| The Central Limit Theorem | \(\overline x\) | \(x\)-bar | the random variable \(x\)-bar |
| The Central Limit Theorem | \(\mu_{\overline{x}}\) | mean of \(x\)-bars | the average of \(x\)-bars |
| The Central Limit Theorem | \(\sigma_{\overline{x}}\) | standard deviation of \(x\)-bars | same |
| Confidence Intervals | \(CL\) | confidence level | same |
| Confidence Intervals | \(CI\) | confidence interval | same |
| Confidence Intervals | \(EBM\) | error bound for a mean | same |
| Confidence Intervals | \(EBP\) | error bound for a proportion | same |
| Confidence Intervals | \(t\) | Student's \(t\)-distribution | same |
| Confidence Intervals | \(df\) | degrees of freedom | same |
| Confidence Intervals | \(t_{\frac{\alpha}{2}}\) | Student's \(t\) with \(\alpha\)/2 area in each tail | same |
| Confidence Intervals | \(P^{\prime}\) | \(P\)-prime | sample proportion of success or interest |
| Hypothesis Testing | \(H_0\) | \(H\)-naught, \(H\)-sub-0 | null hypothesis |
| Hypothesis Testing | \(H_a\) | \(H\)-a, \(H\)-sub a | alternative (or research) hypothesis |
| Hypothesis Testing | \(H_1\) | \(H\)-1, \(H\)-sub 1 | alternative (or research) hypothesis |
| Hypothesis Testing | \(\alpha\) | alpha | probability of Type I error |
| Hypothesis Testing | \(\beta\) | beta | probability of Type II error |
| Hypothesis Testing | \(\overline{x}_1-\overline{x}_2\) | \(x\)1-bar minus \(x\)2-bar | difference in sample means |
| Hypothesis Testing | \(\mu_{1}-\mu_{2}\) | mu-1 minus mu-2 | difference in population means |
| Hypothesis Testing | \(P_{1}^{\prime}-P_{2}^{\prime}\) | \(P\)1-prime minus \(P\)2-prime | difference in sample proportions |
| Hypothesis Testing | \(P_{1}-P_{2}\) | \(P\)1 minus \(P\)2 | difference in population proportions |
| Linear Regression and Correlation | \(Y = a + bX\) | \(Y\) equals \(a\) plus \(b\)-\(X\) | equation of a straight line |
| Linear Regression and Correlation | \(\hat Y\) | \(Y\)-hat | estimated value of \(Y\) |
| Linear Regression and Correlation | \(r\) | sample correlation coefficient | same |
| Linear Regression and Correlation | \(\varepsilon\) | error term for a regression line | same |
| Linear Regression and Correlation | \(SSE\) | Sum of Squared Errors | same |
| \(F\)-Distribution and ANOVA | \(F\) | \(F\)-ratio | \(F\)-ratio |
Formulas
Symbols you must know

| Population | | Sample |
|---|---|---|
| \(N\) | Size | \(n\) |
| \(\mu\) | Mean | \(\overline x\) |
| \(\sigma^2\) | Variance | \(s^2\) |
| \(\sigma\) | Standard deviation | \(s\) |
| \(P\) | Proportion | \(P^{\prime}\) |

Single data set formulae

| Population | | Sample |
|---|---|---|
| \(Q_{3}=\frac{3(N+1)}{4}, \quad Q_{1}=\frac{(N+1)}{4}\) | Interquartile range: \(IQR=Q_{3}-Q_{1}\) | \(Q_{3}=\frac{3(n+1)}{4}, \quad Q_{1}=\frac{(n+1)}{4}\) |
| \(\sigma^{2}=\frac{1}{N} \sum_{i=1}^{N}\left(x_{i}-\mu\right)^{2}\) | Variance | \(s^{2}=\frac{1}{n-1} \sum_{i=1}^{n}\left(x_{i}-\overline{x}\right)^{2}\) |
| \(\sigma^{2}=\frac{1}{N} \sum_{i=1}^{N}\left(x_{i}-\mu\right)^{2} \cdot f_{i}\) | Variance (grouped frequency data) | \(s^{2}=\frac{1}{n-1} \sum_{i=1}^{n}\left(x_{i}-\overline{x}\right)^{2} \cdot f_{i}\) |
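The population and sample variance formulas above differ only in the divisor (\(N\) versus \(n-1\)). A minimal Python sketch, using a small made-up data set, shows both side by side:

```python
import math

# Hypothetical data set of 8 observations.
data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

n = len(data)
mean = sum(data) / n  # 5.0 for this data

ss = sum((x - mean) ** 2 for x in data)  # sum of squared deviations

pop_var = ss / n          # population variance: divide by N
samp_var = ss / (n - 1)   # sample variance: divide by n - 1

pop_sd = math.sqrt(pop_var)    # population standard deviation
samp_sd = math.sqrt(samp_var)  # sample standard deviation

print(pop_var, samp_var)  # 4.0 and about 4.571
```

Dividing by \(n-1\) rather than \(n\) makes the sample variance an unbiased estimator of the population variance, which is why the sample value comes out slightly larger.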
Basic probability rules

| Rule | Name |
|---|---|
| \(P(A \cap B)=P(A \mid B) \cdot P(B)\) | Multiplication rule |
| \(P(A \cup B)=P(A)+P(B)-P(A \cap B)\) | Addition rule |
| \(P(A \cap B)=P(A) \cdot P(B)\), or equivalently \(P(A \mid B)=P(A)\) | Independence test |
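The three rules above can be checked numerically. In this sketch the probabilities are hypothetical, chosen so that \(A\) and \(B\) come out independent:

```python
# Hypothetical probabilities for two events A and B.
p_a = 0.5
p_b = 0.4
p_a_given_b = 0.5  # equals P(A), so A is independent of B here

# Multiplication rule: P(A and B) = P(A|B) * P(B)
p_a_and_b = p_a_given_b * p_b

# Addition rule: P(A or B) = P(A) + P(B) - P(A and B)
p_a_or_b = p_a + p_b - p_a_and_b

# Independence test: A and B are independent iff P(A and B) = P(A) * P(B)
# (a small tolerance guards against floating-point rounding).
independent = abs(p_a_and_b - p_a * p_b) < 1e-12

print(p_a_and_b, p_a_or_b, independent)  # 0.2, 0.7, True
```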
The following formulae require the use of the \(z\), \(t\), or \(F\) tables.

| Formula | Name |
|---|---|
| \(z=\frac{x-\mu}{\sigma}\) | \(z\)-transformation for the normal distribution |
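The \(z\)-transformation expresses a value as a number of standard deviations from the mean. A one-line sketch with hypothetical values (\(\mu = 100\), \(\sigma = 15\), as on an IQ-style scale):

```python
# Hypothetical population parameters and one observed value.
mu = 100.0
sigma = 15.0
x = 130.0

# z-transformation: distance from the mean in standard-deviation units.
z = (x - mu) / sigma
print(z)  # 2.0: x lies two standard deviations above the mean
```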
Test statistics and confidence intervals (bracketed terms equal the margin of error; subscripts denote locations on the respective distribution tables):

| Test statistic | Interval for | Confidence interval |
|---|---|---|
| \(z_{obs}=\frac{\overline{x}-\mu_{0}}{\sigma / \sqrt{n}}\) | The population mean when sigma is known | \(\overline{x} \pm\left[z_{(\alpha / 2)} \frac{\sigma}{\sqrt{n}}\right]\) |
| \(z_{obs}=\frac{\overline{x}-\mu_{0}}{s / \sqrt{n}}\) | The population mean when sigma is unknown and \(n > 100\) | \(\overline{x} \pm\left[z_{(\alpha / 2)} \frac{s}{\sqrt{n}}\right]\) |
| \(t_{obs}=\frac{\overline{x}-\mu_{0}}{s / \sqrt{n}}\) | The population mean when sigma is unknown and \(n < 100\) | \(\overline{x} \pm\left[t_{(n-1),(\alpha / 2)} \frac{s}{\sqrt{n}}\right]\) |
| \(z_{obs}=\frac{P^{\prime}-P_0}{\sqrt{\frac{P_0\left(1-P_0\right)}{n}}}\) | The population proportion | \(P^{\prime} \pm\left[z_{(\alpha / 2)} \sqrt{\frac{P^{\prime}\left(1-P^{\prime}\right)}{n}}\right]\) |
| \(t_{obs}=\frac{\bar{x}_d-\mu_{d}}{s_{d} / \sqrt{n}}\) | The difference between two means with matched pairs, where \(s_d\) is the standard deviation of the differences | \(\bar{x}_d \pm\left[t_{(n-1),(\alpha / 2)} \frac{s_{d}}{\sqrt{n}}\right]\) |
| \(z_{obs}=\frac{\left(\overline{x}_1-\overline{x}_2\right)-\left(\mu_{1}-\mu_{2}\right)}{\sqrt{\frac{s_{1}^{2}}{n_{1}}+\frac{s_{2}^{2}}{n_{2}}}}\) | The difference between two independent means when \(n > 100\) | \(\left(\overline{x}_{1}-\overline{x}_{2}\right) \pm\left[z_{(\alpha / 2)} \sqrt{\frac{s_{1}^{2}}{n_{1}}+\frac{s_{2}^{2}}{n_{2}}}\right]\) |
| \(t_{obs}=\frac{\left(\overline{x}_1-\overline{x}_2\right)-\left(\mu_{1}-\mu_{2}\right)}{\sqrt{\frac{s_{1}^{2}}{n_{1}}+\frac{s_{2}^{2}}{n_{2}}}}\) | The difference between two independent means when \(n < 100\) | \(\left(\overline{x}_{1}-\overline{x}_{2}\right) \pm\left[t_{(n_1+n_2-2),(\alpha / 2)} \sqrt{\frac{s_{1}^{2}}{n_{1}}+\frac{s_{2}^{2}}{n_{2}}}\right]\) |
| | The difference between two population proportions | \(\left(P_{1}^{\prime}-P_{2}^{\prime}\right) \pm\left[z_{(\alpha / 2)} \sqrt{\frac{P_{1}^{\prime}\left(1-P_{1}^{\prime}\right)}{n_{1}}+\frac{P_{2}^{\prime}\left(1-P_{2}^{\prime}\right)}{n_{2}}}\right]\) |
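To illustrate the first row (population mean, sigma known), a short Python sketch with hypothetical numbers builds the confidence interval and the matching test statistic. The critical value comes from the standard library's `statistics.NormalDist`, which replaces a \(z\)-table lookup:

```python
import math
from statistics import NormalDist

# Hypothetical sample: n = 36 measurements, sample mean 68.0,
# population sigma assumed known at 3.0; 95% confidence level.
n = 36
x_bar = 68.0
sigma = 3.0
alpha = 0.05

# z critical value with alpha/2 area in each tail (replaces the z-table).
z_crit = NormalDist().inv_cdf(1 - alpha / 2)  # about 1.96

# Confidence interval: x-bar plus or minus the margin of error.
margin = z_crit * sigma / math.sqrt(n)
ci = (x_bar - margin, x_bar + margin)
print(ci)  # roughly (67.02, 68.98)

# Matching test statistic against a hypothetical null value mu_0 = 67.
mu_0 = 67.0
z_obs = (x_bar - mu_0) / (sigma / math.sqrt(n))
print(z_obs)  # 2.0, which exceeds 1.96, so H0 is rejected at the 5% level
```

The same pattern carries over to the other rows: swap in \(s\) for \(\sigma\), a \(t\) critical value for \(z\), or a proportion's standard error, as the table indicates.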
Simple linear regression formulae for \(Y=a+b(X)\)

| Formula | Name |
|---|---|
| \(r_{X Y}=\frac{\sum\left(X_{i}-\overline{X}\right)\left(Y_{i}-\overline{Y}\right)}{\sqrt{\sum\left(X_{i}-\overline{X}\right)^{2} \sum\left(Y_{i}-\overline{Y}\right)^{2}}}=\frac{\sum X_{i} Y_{i}-\frac{\left(\sum X_{i}\right)\left(\sum Y_{i}\right)}{n}}{\sqrt{\left[\sum X_{i}^{2}-\frac{\left(\sum X_{i}\right)^{2}}{n}\right]\left[\sum Y_{i}^{2}-\frac{\left(\sum Y_{i}\right)^{2}}{n}\right]}}\) | Correlation coefficient |
| \(b_{1}=\frac{\sum\left(X_{i}-\overline{X}\right)\left(Y_{i}-\overline{Y}\right)}{\sum\left(X_{i}-\overline{X}\right)^{2}}=\frac{\sum X_{i} Y_{i}-\frac{\left(\sum X_{i}\right)\left(\sum Y_{i}\right)}{n}}{\sum X_{i}^{2}-\frac{\left(\sum X_{i}\right)^{2}}{n}}=r_{X Y}\left(\frac{s_{Y}}{s_{X}}\right)\) | Coefficient \(b\) (or \(b_1\), slope) |
| \(b_{0}=\overline{Y}-b_{1} \overline{X}\) | \(Y\)-intercept (\(a\), or \(b_0\)) |
| \(s_{e}^{2}=\frac{\sum\left(Y_{i}-\hat{Y}_{i}\right)^{2}}{n-k}=\frac{\sum_{i=1}^{n} e_{i}^{2}}{n-k}\) | Estimate of the error variance |
| \(s_{b}=\sqrt{\frac{s_{e}^{2}}{\sum\left(X_{i}-\overline{X}\right)^{2}}}=\sqrt{\frac{s_{e}^{2}}{(n-1) s_{X}^{2}}}\) | Standard error for coefficient \(b\) |
| \(t_{obs}=\frac{b-\beta_{0}}{s_b}\) | Hypothesis test for coefficient \(\beta\) |
| \(b \pm\left[t_{(n-2),(\alpha / 2)}\, s_{b}\right]\) | Interval for coefficient \(\beta\) |
| \(\hat{Y} \pm\left[t_{(n-2),(\alpha / 2)} \, s_{e} \sqrt{\frac{1}{n}+\frac{\left(X_{p}-\overline{X}\right)^{2}}{\sum\left(X_{i}-\overline{X}\right)^{2}}}\right]\) | Interval for the expected value of \(Y\) |
| \(\hat{Y} \pm\left[t_{(n-2),(\alpha / 2)} \, s_{e} \sqrt{1+\frac{1}{n}+\frac{\left(X_{p}-\overline{X}\right)^{2}}{\sum\left(X_{i}-\overline{X}\right)^{2}}}\right]\) | Prediction interval for an individual \(Y\) |
ANOVA formulae

| Formula | Name |
|---|---|
| \(SS_R=n_1\left(\bar{x}_{1}-\bar{x}\right)^2+\cdots+n_g\left(\bar{x}_{g}-\bar{x}\right)^2\) | Sum of squares regression |
| \(SS_E=\left(n_1-1\right)s_1^2+\cdots+\left(n_g-1\right)s_g^2\) | Sum of squares error |
| \(SS_T=SS_R + SS_E\) | Sum of squares total |
| \(R^{2}=\frac{SS_R}{SS_T}\) | Coefficient of determination |
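The correlation and slope formulas above can be exercised on a tiny hypothetical data set. This sketch uses the deviation-from-the-mean forms of \(r_{XY}\), \(b_1\), and \(b_0\):

```python
import math

# Hypothetical paired observations (X, Y).
X = [1.0, 2.0, 3.0, 4.0, 5.0]
Y = [2.0, 4.0, 5.0, 4.0, 5.0]

n = len(X)
x_bar = sum(X) / n
y_bar = sum(Y) / n

# Sums of squares and cross-products about the means.
sxy = sum((x - x_bar) * (y - y_bar) for x, y in zip(X, Y))
sxx = sum((x - x_bar) ** 2 for x in X)
syy = sum((y - y_bar) ** 2 for y in Y)

b1 = sxy / sxx                   # slope
b0 = y_bar - b1 * x_bar          # Y-intercept
r = sxy / math.sqrt(sxx * syy)   # correlation coefficient

print(b1, b0, r)  # 0.6, 2.2, and about 0.775
```

For this data the fitted line is \(\hat{Y} = 2.2 + 0.6X\), and the positive \(r\) reflects the upward trend in the pairs.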
The following is the breakdown of a one-way ANOVA table for linear regression.

| Source of variation | Sum of squares | Degrees of freedom | Mean squares | \(F\)-ratio |
|---|---|---|---|---|
| Regression | \(n_1\left(\bar{x}_{1}-\bar{x}\right)^2+\cdots+n_g\left(\bar{x}_{g}-\bar{x}\right)^2\) | \(1\) or \(g-1\) | \(MS_R=\frac{SS_R}{df_{R}}\) | \(F=\frac{MS_R}{MS_E}\) |
| Error | \(\left(n_1-1\right)s_1^2+\cdots+\left(n_g-1\right)s_g^2\) | \(n-g\) | \(MS_E=\frac{SS_E}{df_{E}}\) | |
| Total | \(SS_R + SS_E\) | \(n-1\) | | |
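The whole table can be filled in from raw data. This sketch, with three hypothetical groups, computes each sum of squares from the formulas above and then the mean squares and \(F\)-ratio:

```python
from statistics import mean, variance  # variance() uses the n - 1 divisor

# Hypothetical data: g = 3 groups of observations.
groups = [
    [6.0, 8.0, 10.0],
    [7.0, 9.0, 11.0],
    [11.0, 13.0, 15.0],
]

sizes = [len(grp) for grp in groups]
n = sum(sizes)          # total observations
g = len(groups)         # number of groups
grand_mean = sum(sum(grp) for grp in groups) / n

# Sum of squares regression (between groups).
ss_r = sum(m * (mean(grp) - grand_mean) ** 2 for m, grp in zip(sizes, groups))
# Sum of squares error (within groups): (n_i - 1) * s_i^2 summed over groups.
ss_e = sum((m - 1) * variance(grp) for m, grp in zip(sizes, groups))
ss_t = ss_r + ss_e

# Mean squares and F-ratio, using df_R = g - 1 and df_E = n - g.
ms_r = ss_r / (g - 1)
ms_e = ss_e / (n - g)
f_ratio = ms_r / ms_e

print(ss_r, ss_e, f_ratio)  # 42.0, 24.0, 5.25
```

The resulting \(F\)-ratio would then be compared against the \(F\) table with \(g-1\) and \(n-g\) degrees of freedom.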