Lord of the Cramér-Rao Bound

Presentation Transcript


  1. Lord of the Cramér-Rao Bound ECE 7251: Spring 2004 Lecture 9 1/28/04 Prof. Aaron D. Lanterman School of Electrical & Computer Engineering Georgia Institute of Technology AL: 404-385-2548 <lanterma@ece.gatech.edu>

  2. Quest of the Cramér-Rao Bound One bound to beat them all, and in confusion bind them… In the land of Techwood where the Engineers lie. • Our Quest: Find an ultimate performance bound which may not be beaten by any estimator

  3. Fellowship of the CR Bound • Suppose the estimator is unbiased. Then \(E[\hat{\theta}(Y)] = \int \hat{\theta}(y)\,p(y;\theta)\,dy = \theta\) • Take derivative w.r.t. \(\theta\) on both sides: \(\int \hat{\theta}(y)\,\frac{\partial p(y;\theta)}{\partial \theta}\,dy = 1\) • Should check if the interchange of derivative and integral is legit; need the derivative of the density to be absolutely integrable

  4. Breakfast of the CR Bound • Rewrite the derivative using \(\frac{\partial p}{\partial \theta} = \frac{\partial \ln p}{\partial \theta}\,p\): \(\int \hat{\theta}(y)\,\frac{\partial \ln p(y;\theta)}{\partial \theta}\,p(y;\theta)\,dy = 1\) • Since \(\int p(y;\theta)\,dy = 1\), differentiating gives \(\int \frac{\partial \ln p}{\partial \theta}\,p\,dy = 0\) (the score has zero mean), so \(E\!\left[(\hat{\theta}(Y) - \theta)\,\frac{\partial \ln p(Y;\theta)}{\partial \theta}\right] = 1\)

  5. Lunch of the CR Bound • Use the Schwarz, Lone Star! Applying the Cauchy–Schwarz inequality: \(1 = \left(E\!\left[(\hat{\theta} - \theta)\,\frac{\partial \ln p}{\partial \theta}\right]\right)^2 \le E\!\left[(\hat{\theta} - \theta)^2\right] E\!\left[\left(\frac{\partial \ln p}{\partial \theta}\right)^2\right]\)

  6. Dinner of the CR Bound • Rearranging: \(\operatorname{var}(\hat{\theta}) = E\!\left[(\hat{\theta} - \theta)^2\right] \ge \frac{1}{E\!\left[\left(\partial \ln p(Y;\theta)/\partial \theta\right)^2\right]} = \frac{1}{I(\theta)}\) (since estimator is unbiased) • \(I(\theta)\) is the Fisher information
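A quick numerical sanity check of this bound (a sketch; the Gaussian setup and all variable names are my own illustration, not from the slides): the sample mean is unbiased for a Gaussian mean, and its variance should match the CR bound \(1/I(\mu) = \sigma^2/n\).

```python
import numpy as np

# Monte Carlo check: variance of the sample mean vs. the CR bound sigma^2/n.
# The Gaussian example and all parameter values are illustrative assumptions.
rng = np.random.default_rng(0)
mu, sigma, n, trials = 2.0, 1.5, 50, 20000

samples = rng.normal(mu, sigma, size=(trials, n))
estimates = samples.mean(axis=1)     # unbiased estimator of mu
empirical_var = estimates.var()
cr_bound = sigma**2 / n              # 1/I(mu) for n i.i.d. Gaussian samples

print(empirical_var, cr_bound)       # should be close: sample mean is efficient
```

The sample mean is efficient here, so the two numbers should agree up to Monte Carlo error.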

  7. The Two Fisher Information Towers (1) • Start from \(\int p(y;\theta)\,dy = 1\). Take derivative w.r.t. \(\theta\) on both sides: \(\int \frac{\partial \ln p}{\partial \theta}\,p\,dy = 0\) • Take derivative again (assuming we can!): \(\int \left[\frac{\partial^2 \ln p}{\partial \theta^2} + \left(\frac{\partial \ln p}{\partial \theta}\right)^2\right] p\,dy = 0\)

  8. The Two Fisher Information Towers (2) • Different notations for Fisher information: \(I(\theta) = E\!\left[\left(\frac{\partial \ln p(Y;\theta)}{\partial \theta}\right)^2\right] = -E\!\left[\frac{\partial^2 \ln p(Y;\theta)}{\partial \theta^2}\right]\)
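The two "towers" can be checked numerically (a sketch; the exponential-density example and names are my own, not from the lecture): for \(p(y;\lambda) = \lambda e^{-\lambda y}\), the score is \(1/\lambda - y\) and the second derivative of the loglikelihood is the constant \(-1/\lambda^2\), so both forms should give \(1/\lambda^2\).

```python
import numpy as np

# Check E[(d/dlam ln p)^2] = -E[d^2/dlam^2 ln p] for Exponential(lam).
rng = np.random.default_rng(1)
lam = 2.0
y = rng.exponential(1.0 / lam, size=500_000)   # numpy uses scale = 1/rate

score = 1.0 / lam - y            # d/dlam ln p(y;lam)
curvature = -1.0 / lam**2        # d^2/dlam^2 ln p(y;lam), constant here

fisher_form1 = np.mean(score**2) # expected squared score
fisher_form2 = -curvature        # minus expected curvature
print(fisher_form1, fisher_form2)  # both should be near 1/lam^2 = 0.25
```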

  9. Other Terminology • Observed or empirical Fisher information: \(-\frac{\partial^2 \ln p(y;\theta)}{\partial \theta^2}\) evaluated on a specific collected data set • Stochastic Fisher information: \(-\frac{\partial^2 \ln p(Y;\theta)}{\partial \theta^2}\), a random variable • Fisher information is the expected value of the stochastic Fisher information

  10. Matrix CR Bound • Elements of the Fisher information matrix: \(I_{ij}(\theta) = -E\!\left[\frac{\partial^2 \ln p(Y;\theta)}{\partial \theta_i\,\partial \theta_j}\right]\), the average “curvature” of the loglikelihood near \(\theta\) • Matrix CR Bound for unbiased estimators: \(\operatorname{cov}(\hat{\theta}) \ge I^{-1}(\theta)\) • Means \(\operatorname{cov}(\hat{\theta}) - I^{-1}(\theta)\) is nonnegative definite

  11. Nonnegative Definite? Huh??? • If A is nonnegative definite, then: • \(z^T A z \ge 0\) for any real vector z • All eigenvalues of A are nonnegative • Useful consequences: if \(A \ge B\) (i.e., \(A - B\) is nonnegative definite), • Diagonals dominated: \(A_{ii} \ge B_{ii}\) • But does not mean \(A_{rc} \ge B_{rc}\) in general! • Total sum property: \(\operatorname{tr}(A) \ge \operatorname{tr}(B)\)
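These consequences are easy to see on a small example (a sketch; the matrices are hand-picked by me, not from the slides): diagonals are dominated and traces are ordered, but an individual off-diagonal entry of A can be smaller than the corresponding entry of B.

```python
import numpy as np

# Illustrate the nonnegative-definite ordering A >= B with a 2x2 example.
A = np.array([[2.0, -0.5],
              [-0.5, 2.0]])
B = np.array([[1.0,  0.3],
              [ 0.3, 1.0]])

eigs = np.linalg.eigvalsh(A - B)   # eigenvalues of A - B (ascending)
min_eig = eigs.min()               # nonnegative, so A >= B holds

diag_ok = bool(np.all(np.diag(A) >= np.diag(B)))  # diagonals dominated
offdiag_ok = bool(A[0, 1] >= B[0, 1])             # fails here: -0.5 < 0.3
trace_ok = bool(np.trace(A) >= np.trace(B))       # total sum (trace) property
print(min_eig, diag_ok, offdiag_ok, trace_ok)
```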

  12. Efficiency • An unbiased estimator which achieves the CR Bound with equality is called efficient • Efficient estimators are UMVE (but not necessarily the other way around) • Efficient estimators can only exist if the density comes from an exponential family! • Proof uses condition for equality in the Schwarz inequality (1-D version)

  13. Some Properties of ML Estimators • If an efficient estimator exists, then the ML estimator is it! • Suppose we have n independent samples • ML estimators are asymptotically efficient • ML estimators are asymptotically normal. Formally: \(\sqrt{n}\,(\hat{\theta}_{ML} - \theta) \to N\!\left(0,\, I_1^{-1}(\theta)\right)\) in distribution, where \(I_1(\theta)\) is the Fisher information matrix for one data sample
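Asymptotic normality can be seen empirically (a sketch; the exponential example and all names are my own assumption): for Exponential(\(\lambda\)) data, the MLE is \(1/\bar{y}\) and \(I_1(\lambda) = 1/\lambda^2\), so \(\sqrt{n}(\hat{\lambda}_{ML} - \lambda)\) should look approximately \(N(0, \lambda^2)\).

```python
import numpy as np

# Monte Carlo: distribution of sqrt(n)*(MLE - lam) for Exponential(lam).
rng = np.random.default_rng(2)
lam, n, trials = 3.0, 400, 10000

y = rng.exponential(1.0 / lam, size=(trials, n))
mle = 1.0 / y.mean(axis=1)          # ML estimate of the rate in each trial

scaled = np.sqrt(n) * (mle - lam)   # should be roughly N(0, lam^2)
print(scaled.mean(), scaled.std())  # mean near 0, std near lam = 3
```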

  14. Return of the Biased Estimator • CR bound for a biased estimator with bias \(b(\theta) = E[\hat{\theta}] - \theta\): \(\operatorname{var}(\hat{\theta}) \ge \frac{\left(1 + b'(\theta)\right)^2}{I(\theta)}\) • Proof: Poor, pp. 169-171 • There are extensions for the multiparameter case • Bias is often difficult to compute analytically for a particular estimator; hence, the unbiased CR bound is often given anyway!
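A worked example of the biased bound (a sketch; the shrinkage estimator and parameter values are my own illustration): for the estimator \(c\bar{y}\) of a Gaussian mean, the bias is \(b(\mu) = (c-1)\mu\), so \(b'(\mu) = c - 1\) and the bound becomes \(c^2 \sigma^2 / n\), which this estimator meets with equality.

```python
import numpy as np

# Monte Carlo: variance of the biased shrinkage estimator c*ybar vs. the
# biased CR bound (1 + b'(mu))^2 / I(mu) = c^2 * sigma^2 / n.
rng = np.random.default_rng(3)
mu, sigma, n, c, trials = 1.0, 2.0, 30, 0.8, 20000

y = rng.normal(mu, sigma, size=(trials, n))
est = c * y.mean(axis=1)                          # biased estimator of mu

biased_bound = (1 + (c - 1))**2 / (n / sigma**2)  # = c^2 * sigma^2 / n
empirical_var = est.var()
print(empirical_var, biased_bound)                # should match closely
```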

  15. What About Those Bayesians? • A related bound on the MSE: \(E\!\left[(\hat{\theta}(Y) - \theta)^2\right] \ge \frac{1}{E[I(\theta)] + I_P}\), where \(I_P = E\!\left[\left(\frac{\partial \ln p(\theta)}{\partial \theta}\right)^2\right]\) is the information from the prior • Note expectations are now over both Y and \(\theta\) • Easily extended to the multivariate case • Not all authors cover it! • Hero and Poor do not • Van Trees (pp. 72-73, 74-85) and Srinath (p. 163) do

  16. Interpretation of the Bayesian Bound • The denominator of the bound, the expected Fisher information from the data plus the Fisher information of the prior, is a Bayesian analog of the Fisher information matrix. Note it is not a function of \(\theta\) • Info from the data adds with info from the prior • To be “efficient” (achieve the bound with equality) in the Bayesian setting, the posterior density must be Gaussian! • Proof uses the condition for equality in the Schwarz inequality • Stronger requirement than in the nonrandom setting
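The Gaussian case illustrates this "additive information" interpretation (a sketch; the conjugate prior/likelihood setup and all names are my own assumption): with a \(N(0,\tau^2)\) prior on the mean of \(N(\theta,\sigma^2)\) data, the posterior is Gaussian, so the posterior-mean estimator should achieve the Bayesian bound \(1/(n/\sigma^2 + 1/\tau^2)\) with equality.

```python
import numpy as np

# Monte Carlo: MSE of the posterior-mean estimator vs. the Bayesian bound
# 1 / (data information + prior information) in the conjugate Gaussian case.
rng = np.random.default_rng(4)
tau, sigma, n, trials = 1.0, 1.0, 10, 50000

theta = rng.normal(0.0, tau, size=trials)               # draws from the prior
y = rng.normal(theta[:, None], sigma, size=(trials, n)) # data given theta

w = (n / sigma**2) / (n / sigma**2 + 1 / tau**2)        # posterior-mean weight
est = w * y.mean(axis=1)

mse = np.mean((est - theta)**2)
bayes_bound = 1.0 / (n / sigma**2 + 1 / tau**2)         # info adds: = 1/11 here
print(mse, bayes_bound)
```

Because the posterior is exactly Gaussian here, the empirical MSE should match the bound up to Monte Carlo error.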
