Teaching

Welcome to my teaching webpage! During my Ph.D. at the University of Cambridge, I helped teach the following courses:

Statistics IB

  • Introduction and probability revision
  • Estimation, bias and mean squared error
  • Sufficiency
  • Maximum likelihood estimator (MLE)
  • Confidence intervals
  • Bayesian estimation
  • Simple hypotheses
  • Composite hypotheses
  • Tests of goodness-of-fit and independence
  • Tests in contingency tables
  • Multivariate normal theory
  • The linear model
  • The normal linear model
  • Inference in the normal linear model
  • Special cases of the linear model
  • Hypothesis testing in the linear model
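
Several of these topics (maximum likelihood, confidence intervals) can be illustrated in a few lines. The sketch below fits a normal model by maximum likelihood and forms an approximate 95% confidence interval for the mean; the true parameter values and sample size are illustrative choices, not from any course material.

```python
import math
import random

random.seed(0)
# Simulated data: n i.i.d. draws from N(mu, sigma^2); mu, sigma chosen for illustration.
mu, sigma, n = 5.0, 2.0, 200
data = [random.gauss(mu, sigma) for _ in range(n)]

# The MLE of the mean in a normal model is the sample mean.
mle_mean = sum(data) / n
# The MLE of the variance uses the biased 1/n normaliser.
mle_var = sum((x - mle_mean) ** 2 for x in data) / n

# Approximate 95% confidence interval for mu (normal quantile 1.96).
half_width = 1.96 * math.sqrt(mle_var / n)
ci = (mle_mean - half_width, mle_mean + half_width)
```

The interval is centred at the MLE, with width shrinking at the usual 1/√n rate.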

Principles of Statistics

  • Course overview
  • Fisher information
  • Cramér–Rao bound
  • Stochastic convergence
  • Central limit theorem
  • Consistency of the MLE
  • Asymptotic normality of MLE
  • Plug-in MLE and Delta method
  • Asymptotic inference with MLE
  • Introduction to Bayesian statistics
  • Between prior and posterior
  • Frequentist analysis of Bayesian methods
  • Decision theory & Bayesian risk
  • Minimax risk and admissibility
  • Admissibility in the Gaussian model
  • Risk of the James–Stein estimator
  • Classification problems
  • Multivariate analysis
  • Principal component analysis
  • Resampling principles & the bootstrap
  • Validity of the bootstrap
  • Monte Carlo methods
  • Markov chain Monte Carlo methods
  • Introduction to nonparametric statistics
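
The resampling topics above have a compact core idea. Here is a minimal sketch of the nonparametric bootstrap, estimating the standard error of the sample median by resampling with replacement; the data-generating distribution and number of bootstrap replicates are illustrative assumptions.

```python
import random

random.seed(1)
# Illustrative sample; any i.i.d. data would do here.
data = [random.expovariate(1.0) for _ in range(100)]

def median(xs):
    s = sorted(xs)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else 0.5 * (s[mid - 1] + s[mid])

# Nonparametric bootstrap: resample with replacement, recompute the statistic.
B = 2000
boot_medians = []
for _ in range(B):
    resample = [random.choice(data) for _ in range(len(data))]
    boot_medians.append(median(resample))

# Bootstrap estimate of the standard error of the median.
mean_b = sum(boot_medians) / B
boot_se = (sum((m - mean_b) ** 2 for m in boot_medians) / (B - 1)) ** 0.5
```

The same recipe works for any statistic whose sampling distribution is hard to derive in closed form.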

Mathematics of Machine Learning

  • Review of conditional expectation
  • Empirical risk minimisation
  • Sub-Gaussianity and Hoeffding’s inequality
  • Finite hypothesis classes
  • Bounded difference inequality
  • Rademacher complexity
  • VC dimension
  • Convex analysis
  • Convex surrogates
  • Rademacher complexity revisited
  • Gradient descent
  • Stochastic gradient descent
  • Cross-validation
  • AdaBoost & gradient boosting
  • Decision trees & random forests
  • Feedforward neural networks
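
The optimisation topics in this course reduce, in their simplest form, to iterating a gradient step. The sketch below runs plain gradient descent on a one-dimensional convex objective; the objective, step size, and iteration count are illustrative choices.

```python
# Gradient descent on the convex objective f(x) = (x - 3)^2.
def grad(x):
    # Derivative of f at x.
    return 2.0 * (x - 3.0)

x, step = 0.0, 0.1
for _ in range(100):
    # Each step contracts the error (x - 3) by a factor (1 - 2 * step) = 0.8.
    x -= step * grad(x)
# x is now very close to the minimiser 3.0
```

With a fixed step size below the curvature threshold, the iterates converge geometrically to the minimiser.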