Fundamentals of statistical inference

Profs. Bissiri, Viroli, Soffritti

  • Date: 08 January – 30 March 2023

  • Event location: Aula IV, Via Belle Arti 41 and STAT PhD Classes Virtual Room

  • Type: Cycle 38 - Mandatory Courses

Aims: to explore a number of statistical principles. To consider the nature of statistical parameters, the different viewpoints of the Bayesian and frequentist approaches, and their relationship with the given statistical principles. To introduce the idea of inference as a statistical decision problem. To introduce important aspects of statistical modelling, including model selection, various extensions of likelihood theory, and their links with alternative approaches. To master asymptotic arguments.

Learning outcomes: An appreciation for the complexity of statistical inference, recognition of its inherent subjectivity and the role of expert judgement, the ability to critique familiar inference methods, knowledge of the key choices that must be made, and scepticism about apparently simple answers to difficult questions.

Final exam: written exam for each module.

Course contents
The course is organised into three modules, with a final assessment for each module.

Module 1: Foundations of Statistics (prof. P.G.Bissiri)
- Statistics, Sufficiency, and Completeness
- Efficiency of estimators, optimality of tests
- The Bayesian approach, exchangeability, Dirichlet process (a simulation sketch follows this list)
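
A possible concrete entry point to the last topic is the stick-breaking construction, which simulates a (truncated) draw from a Dirichlet process prior. The sketch below is illustrative only: the standard normal base measure, the concentration parameter alpha = 2, and the truncation level are assumptions made for the example, not part of the course material.

    # Minimal stick-breaking sketch of a Dirichlet process DP(alpha, G0).
    # Base measure G0 = N(0, 1), alpha = 2, and the truncation level are
    # illustrative assumptions for this example.
    import numpy as np

    rng = np.random.default_rng(0)

    def dp_stick_breaking(alpha, n_atoms=1000):
        # Stick proportions V_k ~ Beta(1, alpha);
        # weights w_k = V_k * prod_{j<k} (1 - V_j)
        betas = rng.beta(1.0, alpha, size=n_atoms)
        leftover = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
        weights = betas * leftover
        atoms = rng.standard_normal(n_atoms)   # atom locations drawn from G0
        return weights, atoms

    weights, atoms = dp_stick_breaking(alpha=2.0)
    print("mass captured by the truncation:", weights.sum())  # close to 1

A draw from the process is then the discrete random measure sum_k w_k * delta(atom_k), which makes the almost-sure discreteness of the Dirichlet process visible.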

Module 2: Asymptotic Statistics (prof. C.Viroli)
- First-order asymptotic theory of likelihood
- Delta method. Variance-stabilizing transformations. Skewness-reducing transformations (a sketch follows this list)
- Beyond the likelihood:
     * Estimating Equations, M-estimators, Z-estimators
     * Quasi likelihood
     * Pairwise likelihood
     * Empirical likelihood
- Statistical functionals
     * Extreme values: asymptotic distribution
     * Quantiles and limit distributions
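
As a small taste of the delta method material, the sketch below checks empirically that the square-root transformation is variance-stabilizing for the Poisson family: by the delta method, Var(sqrt(X)) is approximately 1/4 regardless of the rate. The rates and the simulation size are arbitrary illustrative choices.

    # Empirical check that sqrt is variance-stabilizing for Poisson(lam):
    # the delta method gives Var(sqrt(X)) ~ 1/4 for large lam, whatever lam is.
    import numpy as np

    rng = np.random.default_rng(0)
    for lam in (5, 20, 100):
        x = rng.poisson(lam, size=200_000)
        print(f"lambda={lam:>3}: var(X)={x.var():8.2f}, "
              f"var(sqrt(X))={np.sqrt(x).var():.4f}  (delta method: 0.25)")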

Module 3: Kullback-Leibler information and statistical modelling (prof. G.Soffritti)
- Statistical models. Model identification and (mis)specification.
- Kullback-Leibler information, expected log-likelihood and their estimators.
- Kullback-Leibler information as a criterion for evaluating statistical models that approximate the true distribution of the data.
- Asymptotic properties of the maximum likelihood estimator in the presence of model misspecification (a sketch follows this list).
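
To make the module's central quantity tangible, here is a minimal Monte Carlo sketch, under assumptions chosen purely for illustration: the data truly follow a Student-t distribution with 4 degrees of freedom (density g), the working model is a Gaussian fitted by maximum likelihood (giving the quasi-true density f_hat), and the Kullback-Leibler information KL(g || f_hat) = E_g[log g(X) - log f_hat(X)] is estimated by a sample average.

    # Monte Carlo estimate of Kullback-Leibler information under misspecification.
    # True model: Student-t with 4 df; working model: Gaussian fitted by ML.
    # Both distributions and the sample size are illustrative assumptions.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    x = stats.t.rvs(df=4, size=100_000, random_state=rng)

    mu_hat, sigma_hat = x.mean(), x.std()   # Gaussian MLEs (ddof=0 is the MLE)
    kl_hat = np.mean(stats.t.logpdf(x, df=4)
                     - stats.norm.logpdf(x, mu_hat, sigma_hat))
    print(f"estimated KL(g || f_hat) = {kl_hat:.4f}")   # strictly positive here

The estimate stays bounded away from zero as the sample grows, which is exactly the misspecification gap analysed by White (1982).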

References

Module 1:
Shao, J. (2003). Mathematical Statistics. Springer.
Schervish, M.J. (1995). Theory of Statistics. Springer.

Module 2:
Barndorff-Nielsen, O.E. and Cox, D.R. (1989). Asymptotic Techniques for Use in Statistics. Chapman and Hall, London.
Casella, G. and Berger, R.L. (1990). Statistical Inference. Wadsworth & Brooks/Cole.
Small, C.G. (2010). Expansions and Asymptotics for Statistics. Chapman & Hall/CRC.
Pace, L. and Salvan, A. (1997). Principles of Statistical Inference from a Neo-Fisherian Perspective. Advanced Series on Statistical Science and Applied Probability, Vol. 4. World Scientific, Singapore.
van der Vaart, A.W. (2000). Asymptotic Statistics. Cambridge University Press.

Module 3:
Konishi, S. and Kitagawa, G. (2008). Information Criteria and Statistical Modeling. Springer, New York (Chapter 3).
White, H. (1982). Maximum likelihood estimation of misspecified models. Econometrica, 50(1), 1-25.