When you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind. - Kelvin (1824-1907)

  • Syllabus
  • Al's email, Office Phone: 994-5145, Office: Barnard (EPS) 304, Office Hours and Schedule.
  • Useful links:
  • Exams:
  • Projects and HWs:
  • Course Schedule:
    • 4/28 Course Review
    • 4/26 §10.11 Likelihood Ratio Tests
    • 4/24 §10.10 Uniformly Most Powerful tests, approximate large-sample most powerful tests

    • 4/21 §10.10 Powerful tests and the Neyman-Pearson Lemma
    • 4/19 §10.9 Hypothesis tests of two population variances
    • 4/17 §10.5, 10.9 Hypothesis testing with CIs; Testing the variance from a single population
    • 4/14 NO CLASS
    • 4/12 §10.4 Sample size calculations
    • 4/10 §16.5 Issues with "non-informative" priors. Using Markov chain Monte Carlo for drawing posterior samples (a minimal Metropolis sketch follows the schedule)

    • 4/7 §10.3, 10.8 Hypothesis testing with asymptotically normal estimators, one-sample t-test
    • 4/5 §10.4, 10.6 Hypothesis testing with p-values and rejection regions, one-sample test of proportions, Type I and Type II errors
    • 4/3 Exam 2

    • 3/31 Review
    • 3/29 §10.1-2 Hypothesis Testing and the Scientific Method
    • 3/27 §16.3 Bayesian interval estimators (credible or probability intervals), Bayesian posteriors for a normal mean and variance

    • 3/24 §16.2 Using a posterior from a previous data analysis as a prior in a new data analysis.
    • 3/22 §16.2 Conjugate priors and Bayesian point estimators.
    • 3/20 §16.1, 16.5 Posterior and prior information. "Non-informative" flat priors. And who needs marginals anyways? Chapter 16 Notes

      SPRING BREAK!!!!
    • 3/10 §16.1 Bayes rule
    • 3/8 §9.8 Cramér-Rao lower bound for MLE variance, CLT for MLEs
    • 3/6 §9.7 Beautiful properties of MLEs: sufficiency, MVUE, consistency, invariance

    • 3/3 §9.7 MLE examples: Geometric and Poisson.
    • 3/1 §9.6-7 Method of moments (MOM), Method of maximum likelihood (MLE)
    • 2/27 §9.5 Rao-Blackwell Theorem, MVUE

    • 2/24 §9.4 Sufficiency, Factorization Theorem, likelihood function
    • 2/22 §9.3 Consistent point estimators for the population mean, variance, and standard deviation; Slutsky's Theorem
    • 2/20 PRESIDENTS DAY

    • 2/17 Exam 1
    • 2/15 Review
    • 2/13 §9.3 Consistency, General Weak Law of Large Numbers
    • 2/10 §8.9, 9.2 CI for variance example. Efficiency. Review of Chebyshev's Theorem (Thm 4.13). Calculating t and F CIs in R: Alligator means; chi-square CIs in R: Biofilm repeatability (see the R sketch after the schedule).
    • 2/8 §8.9 Two-sample CI for a difference in means assuming equal variances, CI for a population variance. 
    • 2/6 §8.7-8 Interpreting CIs! Sample size calculations; CIs for parameters whose estimators are (asymptotically) normal and the variance is UNknown, two-sample CI for difference in means
    • 2/3 §8.5-8.6 Pivotal method, CIs for parameters whose estimators are (asymptotically) normal and the variance is known
    • 2/1 §8.5 Confidence intervals
    • 1/30 §8.1-8.4 point and interval estimators, bias, MSE, MVUEs
    • 1/27 §7.4 Proof of the Central Limit Theorem
    • 1/25 §7.2 Sampling distributions: F distribution
    • 1/23 §7.2 Sampling distributions: chi-square, t
    • 1/20 §7.5 Normal approximation of the binomial (see the R sketch after the schedule)
    • 1/18 §7.2-7.3 Mean, Variance and Sampling Distribution of the sample mean; Central Limit Theorem
    • 1/16 §7.1 Statistics and Sampling Distributions, Chapter 7 Notes
    • 1/13 Tying together STAT421 and STAT422 with the Scientific Method
    • 1/11 Welcome! Remember the Scientific Method?
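
  R sketches for three of the topics above. Each is a minimal, self-contained sketch using simulated or made-up inputs in place of the course datasets, so treat the numbers as illustrative only.

  For the 1/20 entry (§7.5): the normal approximation to a binomial probability, with a continuity correction. The values of n, p, and k are assumptions, not from class.

    ## Normal approximation to P(Y <= k) for Y ~ Binomial(n, p),
    ## with a continuity correction. n, p, k are illustrative values.
    n <- 40; p <- 0.3; k <- 15
    pbinom(k, size = n, prob = p)                            # exact P(Y <= 15)
    pnorm(k + 0.5, mean = n*p, sd = sqrt(n*p*(1 - p)))       # normal approximation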
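  For the 2/10 entry (§8.9, 9.2): a t CI for a population mean and a chi-square CI for a population variance. The data are simulated; the alligator and biofilm datasets from class are not reproduced here.

    ## 95% CIs from a simulated sample (class datasets not included here).
    set.seed(1)
    x <- rnorm(25, mean = 3.5, sd = 0.8)     # hypothetical sample, n = 25
    n <- length(x); alpha <- 0.05
    xbar <- mean(x); s <- sd(x)

    ## t CI for the mean: xbar +/- t_{alpha/2, n-1} * s / sqrt(n)
    xbar + c(-1, 1) * qt(1 - alpha/2, df = n - 1) * s / sqrt(n)
    t.test(x)$conf.int                       # same interval, built in

    ## chi-square CI for the variance:
    ## ( (n-1)s^2 / chi^2_{alpha/2}, (n-1)s^2 / chi^2_{1-alpha/2} )
    (n - 1) * s^2 / qchisq(c(1 - alpha/2, alpha/2), df = n - 1)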
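  For the 4/10 entry (§16.5): a random-walk Metropolis sampler drawing from the posterior of a normal mean mu with known sigma and a flat prior, so the unnormalized posterior is just the likelihood. The data, proposal scale, and burn-in length are assumed values, not course settings.

    ## Random-walk Metropolis for mu | y, with sigma = 2 known and a flat prior.
    set.seed(2)
    y <- rnorm(30, mean = 10, sd = 2)        # hypothetical data
    log_post <- function(mu) sum(dnorm(y, mean = mu, sd = 2, log = TRUE))

    n_iter <- 5000
    draws  <- numeric(n_iter)
    mu_cur <- mean(y)                        # start the chain at the sample mean
    for (i in 1:n_iter) {
      mu_prop <- mu_cur + rnorm(1, sd = 0.5)             # propose a step
      if (log(runif(1)) < log_post(mu_prop) - log_post(mu_cur))
        mu_cur <- mu_prop                                # accept; else stay put
      draws[i] <- mu_cur
    }
    mean(draws[-(1:500)])                                # posterior mean after burn-in
    quantile(draws[-(1:500)], c(0.025, 0.975))           # 95% credible interval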