
Inductive Amnesia


Presentation Transcript


  1. Inductive Amnesia: The Reliability of Iterated Belief Revision

  2. A Table of Opposites
     Even                  | Odd
     Straight              | Crooked
     Reliability           | “Confirmation”
     Performance           | “Primitive norms”
     Correctness           | “Coherence”
     Classical statistics  | Bayesianism
     Learning theory       | Belief Revision Theory

  3. The Idea • Belief revision is inductive reasoning • A restrictive norm prevents us from finding truths we could have found by other means • Some proposed belief revision methods are restrictive • The restrictiveness is expressed as inductive amnesia

  4. Inductive Amnesia • No restriction on memory... • No restriction on predictive power... • But prediction causes memory loss... • And perfect memory precludes prediction! • Fundamental dilemma

  5. Outline • I. Seven belief revision methods • II. Belief revision as learning • III. Properties of the methods • IV. The Goodman hierarchy • V. Negative results • VI. Positive results • VII. Discussion

  6. Points of Interest • Strong negative and positive results • Short run advice from limiting analysis • 2 is magic for reliable belief revision • Learning as cube rotation • Grue

  7. Part I Iterated Belief Revision

  8. Bayesian (Vanilla) Updating • Propositions are sets of “possible worlds” (diagram: a belief proposition B)

  9. Bayesian (Vanilla) Updating (diagram: new evidence E arrives, overlapping B)

  10. Bayesian (Vanilla) Updating • Perfect memory • No inductive leaps • B’ = B * E = B ∩ E

  11. Epistemic Hell (diagram: the belief proposition B)

  12. Epistemic Hell (diagram: surprise evidence E disjoint from B lands the agent in epistemic hell)

  13. Epistemic Hell • Scientific revolutions • Suppositional reasoning • Conditional pragmatics • Decision theory • Game theory • Databases

  14. Ordinal Entrenchment (Spohn 88) • Epistemic state S maps worlds to ordinals • Belief state of S: b(S) = S⁻¹(0) • Determines “centrality” of beliefs • Model: orders of infinitesimal probability (diagram: levels 0, 1, 2, …, ω, ω+1, with B = b(S) at level 0)
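
Slide 14's picture is easy to prototype. Here is a minimal sketch, assuming worlds are hashable labels and approximating ordinal levels by non-negative integers (so genuinely transfinite levels like ω are out of scope); belief_state is my own name for b(S), not notation from the deck.

```python
# Sketch: an epistemic state as a dict from worlds to entrenchment
# levels (ordinals approximated by non-negative integers).

def belief_state(S):
    """b(S) = S⁻¹(0): the set of worlds at the most plausible level."""
    return {w for w, rank in S.items() if rank == 0}

S = {"w1": 0, "w2": 0, "w3": 2}   # hypothetical three-world state
print(belief_state(S))            # {'w1', 'w2'} (set order may vary)
```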

  15. Belief Revision Methods • * takes an epistemic state and a proposition to an epistemic state (diagram: S with b(S), plus evidence E, yields S * E with b(S * E))

  16. Spohn Conditioning *C (Spohn 88) (diagram: the initial state S with b(S))

  17. Spohn Conditioning *C (Spohn 88) (diagram: new evidence E contradicting b(S))

  18. Spohn Conditioning *C (Spohn 88) (diagram: the revised state S *C E with new belief state B’)

  19. Spohn Conditioning *C (Spohn 88) • Conditions an entire entrenchment ordering • Perfect memory • Inductive leaps • No epistemic hell on consistent sequences • Epistemic hell on inconsistent sequences
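
Under the same dict-of-ranks representation, a minimal sketch of *C, with evidence E as a set of worlds (the representation and the exception for epistemic hell are my choices, not the deck's):

```python
def spohn_condition(S, E):
    """*C: keep only E-worlds and shift ranks so the minimum is 0."""
    survivors = {w: r for w, r in S.items() if w in E}
    if not survivors:
        # No world is consistent with the accumulated evidence.
        raise ValueError("epistemic hell: evidence contradicts every world")
    low = min(survivors.values())
    return {w: r - low for w, r in survivors.items()}

S = {"w1": 0, "w2": 1, "w3": 2}
print(spohn_condition(S, {"w2", "w3"}))   # {'w2': 0, 'w3': 1}
```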

  20. Lexicographic Updating *L (Spohn 88, Nayak 94) (diagram: the initial state S)

  21. Lexicographic Updating *L (Spohn 88, Nayak 94) (diagram: evidence E arrives)

  22. Lexicographic Updating *L (Spohn 88, Nayak 94) • Lift refuted possibilities above non-refuted possibilities, preserving order • Perfect memory on consistent sequences • Inductive leaps • No epistemic hell
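
A sketch of *L under the same representation. The slide only requires that order be preserved; this version preserves relative ranks within each group, which is one way to do that:

```python
def lexicographic_update(S, E):
    """*L: E-worlds drop rigidly to the bottom; refuted (non-E) worlds
    are lifted above every E-world.  Order is preserved within groups."""
    in_E  = {w: r for w, r in S.items() if w in E}
    out_E = {w: r for w, r in S.items() if w not in E}
    new = {}
    if in_E:
        low = min(in_E.values())
        new = {w: r - low for w, r in in_E.items()}
    top = max(new.values(), default=-1) + 1
    if out_E:
        low = min(out_E.values())
        for w, r in out_E.items():
            new[w] = top + (r - low)
    return new

S = {"w1": 0, "w2": 1, "w3": 2}
print(lexicographic_update(S, {"w3"}))    # {'w3': 0, 'w1': 1, 'w2': 2}
```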

  23. Minimal or “Natural” Updating *M (Spohn 88, Boutilier 93) (diagram: the initial state S with belief state B)

  24. Minimal or “Natural” Updating *M (Spohn 88, Boutilier 93) (diagram: evidence E arrives)

  25. Minimal or “Natural” Updating *M (Spohn 88, Boutilier 93) • Drop the lowest possibilities consistent with the data to the bottom and raise everything else up one notch • Inductive leaps • No epistemic hell • But...
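
A sketch of *M under the same representation (it assumes the evidence is consistent with at least one world in S):

```python
def minimal_update(S, E):
    """*M: the most plausible E-worlds drop to level 0; everything else
    is raised one notch (assumes some world of S is in E)."""
    low = min(r for w, r in S.items() if w in E)
    best = {w for w, r in S.items() if w in E and r == low}
    return {w: (0 if w in best else r + 1) for w, r in S.items()}

S = {"w1": 0, "w2": 1, "w3": 1}
print(minimal_update(S, {"w2", "w3"}))    # {'w1': 1, 'w2': 0, 'w3': 0}
```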

  26. Amnesia • What goes up can come down • Belief no longer entails past data (diagram: evidence E arrives)

  27. Amnesia • What goes up can come down • Belief no longer entails past data (diagram: after *M on E, new evidence E’ arrives)

  28. Amnesia • What goes up can come down • Belief no longer entails past data (diagram: after *M on E and then *M on E’, belief no longer entails the earlier datum E)

  29. The Flush-to-a Method *F,a (Goldszmidt and Pearl 94) (diagram: the initial state S with belief state B)

  30. The Flush-to-a Method *F,a (Goldszmidt and Pearl 94) (diagram: evidence E arrives; level a marked)

  31. The Flush-to-a Method *F,a (Goldszmidt and Pearl 94) • Send non-E worlds to a fixed level a and drop E-worlds rigidly to the bottom • Perfect memory on sequentially consistent data if a is high enough • Inductive leaps • No epistemic hell
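
A sketch of *F,a, with the flush level a passed explicitly:

```python
def flush_to_a(S, E, a):
    """*F,a: E-worlds drop rigidly to the bottom; every non-E world is
    flushed to the fixed level a."""
    low = min(r for w, r in S.items() if w in E)
    return {w: (r - low if w in E else a) for w, r in S.items()}

S = {"w1": 0, "w2": 1, "w3": 3}
print(flush_to_a(S, {"w2", "w3"}, a=5))   # {'w1': 5, 'w2': 0, 'w3': 2}
```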

  32. Ordinal Jeffrey Conditioning *J,a (Spohn 88) (diagram: the initial state S)

  33. Ordinal Jeffrey Conditioning *J,a (Spohn 88) (diagram: evidence E arrives)

  34. Ordinal Jeffrey Conditioning *J,a (Spohn 88) (diagram: non-E worlds raised to level a)

  35. Ordinal Jeffrey Conditioning *J,a (Spohn 88) • Drop E-worlds to the bottom; drop non-E worlds to the bottom and then jack them up to level a • Perfect memory on consistent sequences if a is large enough • No epistemic hell • Reversible • But...
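
A sketch of *J,a. The example illustrates the "But..." on the slide: a refuted world can come out lower than it started, which is the empirical backsliding of slides 36-38:

```python
def ordinal_jeffrey(S, E, a):
    """*J,a: drop E-worlds so their minimum is 0; drop non-E worlds the
    same way, then jack them up to level a."""
    low_in  = min((r for w, r in S.items() if w in E), default=0)
    low_out = min((r for w, r in S.items() if w not in E), default=0)
    return {w: (r - low_in if w in E else (r - low_out) + a)
            for w, r in S.items()}

# A refuted world can end up lower than it started:
S = {"w1": 5, "w2": 1}
print(ordinal_jeffrey(S, {"w2"}, a=2))   # {'w1': 2, 'w2': 0} -- w1 fell from 5 to 2
```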

  36. Empirical Backsliding

  37. Empirical Backsliding (diagram: evidence E; level a)

  38. Empirical Backsliding • Ordinal Jeffrey conditioning can increase the plausibility of a refuted possibility

  39. The Ratchet Method *R,a (Darwiche and Pearl 97) (diagram: the initial state S)

  40. The Ratchet Method *R,a (Darwiche and Pearl 97) (diagram: a refuted world at level b moves up to b + a)

  41. The Ratchet Method *R,a (Darwiche and Pearl 97) • Like ordinal Jeffrey conditioning, except refuted possibilities move up by a from their current positions • Perfect memory if a is large enough • Inductive leaps • No epistemic hell
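
A sketch of *R,a. The only change from the ordinal Jeffrey sketch is that refuted worlds move up by a from where they currently sit:

```python
def ratchet(S, E, a):
    """*R,a: like *J,a, but each refuted world moves up by a from its
    current position, so refuted possibilities never slide back down."""
    low = min((r for w, r in S.items() if w in E), default=0)
    return {w: (r - low if w in E else r + a) for w, r in S.items()}

S = {"w1": 5, "w2": 1}
print(ratchet(S, {"w2"}, a=2))            # {'w1': 7, 'w2': 0}
```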

  42. Part II Belief Revision as Learning

  43. Iterated Belief Revision • S0 * () = S0 • S0 * (E0, ..., En, En+1) = (S0 * (E0, ..., En)) * En+1 (diagram: S0, S1, S2 under evidence E0, E1, with belief states b(S0), b(S1), b(S2))
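
The recursion on slide 43 is just a left fold of the one-step operator over the evidence sequence; a minimal sketch (iterate is my own name):

```python
from functools import reduce

def iterate(revise, S0, evidence):
    """S0 * (E0, ..., En): left-fold the one-step revision operator."""
    return reduce(revise, evidence, S0)

# For the parameterized methods, fix a first, e.g.:
#   iterate(lambda S, E: ratchet(S, E, a=2), S0, [E0, E1, E2])
```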

  44. A Very Simple Learning Paradigm • A mysterious system emits an outcome sequence, e.g. 0 0 1 0 0 ... • e ranges over the possible infinite trajectories • e|n is the initial segment of e of length n

  45. Empirical Propositions • Empirical propositions are sets of possible trajectories • Some special cases: [s] = the proposition that the finite sequence s has occurred (a “fan”), [k, n] = the proposition that k occurs at stage n, {e} = the proposition that the future trajectory is exactly e
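
A minimal sketch of the special cases, modeling a trajectory as a function from stages to outcomes and a proposition as a predicate (the names at and fan are my own):

```python
# A trajectory is modeled as a function from stage (int) to outcome;
# a proposition as a predicate on trajectories.

def at(k, n):
    """[k, n]: the proposition that k occurs at stage n."""
    return lambda e: e(n) == k

def fan(s):
    """[s]: the proposition that the finite sequence s has occurred."""
    return lambda e: all(e(i) == s[i] for i in range(len(s)))

zeros = lambda n: 0                       # the all-zero trajectory
print(at(0, 3)(zeros), fan((0, 0, 0))(zeros))   # True True
```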

  46. Trajectory Identification • (*, S0) identifies e ⇔ for all but finitely many n, b(S0 * ([0, e(0)], ..., [n, e(n)])) = {e}

  47. Trajectory Identification • (*, S0) identifies e ⇔ for all but finitely many n, b(S0 * ([0, e(0)], ..., [n, e(n)])) = {e} (diagram: the possible trajectories and the true trajectory e)

  48. Trajectory Identification • (*, S0) identifies e ⇔ for all but finitely many n, b(S0 * ([0, e(0)], ..., [n, e(n)])) = {e} (diagram: the initial belief state b(S0))

  49. Trajectory Identification • (*, S0) identifies e ⇔ for all but finitely many n, b(S0 * ([0, e(0)], ..., [n, e(n)])) = {e} (diagram: b(S1) after the first datum)

  50. Trajectory Identification • (*, S0) identifies e ⇔ for all but finitely many n, b(S0 * ([0, e(0)], ..., [n, e(n)])) = {e} (diagram: b(S2) after the second datum)
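
The identification criterion quantifies over all but finitely many n, which no finite program can check exactly; the sketch below is only a finite-horizon approximation, with worlds truncated to tuples and the lexicographic sketch from earlier reused as the revision operator:

```python
def identifies_up_to(revise, S0, e, horizon):
    """Finite-horizon check: feed the evidence [n, e(n)] stage by stage
    and test whether the belief state has collapsed to {e}.  Worlds are
    trajectories truncated to tuples of length `horizon`."""
    S = dict(S0)
    for n in range(horizon):
        E = {w for w in S if w[n] == e[n]}        # the proposition [n, e(n)]
        S = revise(S, E)
    return {w for w, r in S.items() if r == 0} == {e}

worlds = {(0, 0, 0), (0, 1, 0), (1, 1, 1)}
S0 = {w: 0 for w in worlds}                       # initially agnostic
print(identifies_up_to(lexicographic_update, S0, (0, 1, 0), 3))  # True
```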
