Estimation Theory (Graduate Level)


Course Objectives

To present the fundamental principles of estimation theory.
To present the basic estimation algorithms: LS, ML, MAP, MMSE, KF, EKF, and more.

Course Outline

The course evolves from static parameter identification to dynamic state estimation, and from deterministic parameters (the Fisher approach) to stochastic parameters (Bayesian estimation).

Part I: Deterministic, Static Parameters

Basic ideas in estimation, examples for parameter estimation applications.
LS estimators: batch and recursive forms, covariance/information forms, the normal equations, the orthogonality principle.
Statistical properties of estimators (small and large sample): unbiasedness, efficiency, consistency, the Cramér-Rao bound.
Best Linear Unbiased Estimator (BLUE).
Maximum Likelihood estimator (ML).
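As a taste of the Part I material, the following sketch solves a batch least-squares problem via the normal equations and checks the orthogonality principle; the linear model z = Hx + v and all variable names are illustrative, not part of the course notes.

```python
import numpy as np

# Hypothetical linear measurement model: z = H x + v (all names illustrative).
# Batch LS minimizes ||z - H x||^2, solved via the normal equations
#   (H^T H) x_hat = H^T z.
rng = np.random.default_rng(0)
x_true = np.array([2.0, -1.0])                    # unknown parameter vector
H = rng.standard_normal((100, 2))                 # known regressor matrix
z = H @ x_true + 0.1 * rng.standard_normal(100)   # noisy measurements

x_hat = np.linalg.solve(H.T @ H, H.T @ z)         # batch LS estimate

# Orthogonality principle: the residual is orthogonal to the columns of H.
residual = z - H @ x_hat
print(np.allclose(H.T @ residual, 0.0, atol=1e-8))  # prints True
```

The recursive (RLS) form covered in the course updates x_hat measurement by measurement instead of inverting H^T H once.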

Part II: Stochastic, Static Parameters

Minimum Mean Square Error estimator (MMSE). The fundamental theorem of estimation theory.
Maximum a Posteriori estimator (MAP).
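For the linear-Gaussian case treated in Part II, the MAP estimate has a closed form and coincides with the MMSE estimate. The sketch below illustrates this with an assumed model z = Hx + v, v ~ N(0, R), and Gaussian prior x ~ N(x0, P0); the model and names are illustrative assumptions.

```python
import numpy as np

# Hypothetical linear-Gaussian model (names illustrative):
#   z = H x + v,  v ~ N(0, R),  prior x ~ N(x0, P0).
# MAP estimate (equal to the MMSE estimate in this Gaussian case):
#   x_map = (H' R^-1 H + P0^-1)^-1 (H' R^-1 z + P0^-1 x0)
rng = np.random.default_rng(1)
x_true = np.array([1.0, 3.0])
H = rng.standard_normal((50, 2))
R_inv = np.eye(50) / 0.04                  # measurement noise variance 0.2^2
x0 = np.zeros(2)                           # prior mean
P0_inv = np.eye(2) / 10.0                  # broad Gaussian prior

z = H @ x_true + 0.2 * rng.standard_normal(50)
A = H.T @ R_inv @ H + P0_inv
b = H.T @ R_inv @ z + P0_inv @ x0
x_map = np.linalg.solve(A, b)              # posterior mean / MAP estimate
```

As the prior broadens (P0^-1 → 0), this estimate reduces to the ML/weighted-LS solution from Part I.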

Part III: Stochastic, Dynamic Parameters

Kalman filter (KF): an innovations-based derivation.
Extended Kalman filter (EKF) and its use for parameter identification.

Additional, Optional Subjects

Target tracking, adaptive estimation, multiple-model filters, particle filters, square-root filters, and more.

Assumed Background

Probability and Stochastic Processes, e.g.,

  • Davenport, Probability and Random Processes, 1970, Chapters 1–9
  • Melsa and Sage, An Introduction to Probability and Stochastic Processes, 1973, Chapters 1–5
  • Papoulis, Probability, Random Variables and Stochastic Processes, 1984, Chapters 1–9

Linear Algebra.
Matrix Theory is an advantage.